---
pretty_name: Evaluation run of openbmb/UltraLM-13b-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openbmb/UltraLM-13b-v2.0](https://huggingface.co/openbmb/UltraLM-13b-v2.0) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T05:11:16.252341](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0/blob/main/results_2023-10-25T05-11-16.252341.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24842701342281878,\n\
\ \"em_stderr\": 0.004425115813837483,\n \"f1\": 0.3269431627516796,\n\
\ \"f1_stderr\": 0.004386855622561775,\n \"acc\": 0.4373652518320964,\n\
\ \"acc_stderr\": 0.010268101875758145\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.24842701342281878,\n \"em_stderr\": 0.004425115813837483,\n\
\ \"f1\": 0.3269431627516796,\n \"f1_stderr\": 0.004386855622561775\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10993176648976498,\n \
\ \"acc_stderr\": 0.008616195587865406\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650886\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openbmb/UltraLM-13b-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|arc:challenge|25_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T05_11_16.252341
path:
- '**/details_harness|drop|3_2023-10-25T05-11-16.252341.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T05-11-16.252341.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T05_11_16.252341
path:
- '**/details_harness|gsm8k|5_2023-10-25T05-11-16.252341.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T05-11-16.252341.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hellaswag|10_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T08-34-12.309014.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T08-34-12.309014.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T05_11_16.252341
path:
- '**/details_harness|winogrande|5_2023-10-25T05-11-16.252341.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T05-11-16.252341.parquet'
- config_name: results
data_files:
- split: 2023_10_09T08_34_12.309014
path:
- results_2023-10-09T08-34-12.309014.parquet
- split: 2023_10_25T05_11_16.252341
path:
- results_2023-10-25T05-11-16.252341.parquet
- split: latest
path:
- results_2023-10-25T05-11-16.252341.parquet
---
# Dataset Card for Evaluation run of openbmb/UltraLM-13b-v2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openbmb/UltraLM-13b-v2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openbmb/UltraLM-13b-v2.0](https://huggingface.co/openbmb/UltraLM-13b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T05:11:16.252341](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0/blob/main/results_2023-10-25T05-11-16.252341.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.24842701342281878,
"em_stderr": 0.004425115813837483,
"f1": 0.3269431627516796,
"f1_stderr": 0.004386855622561775,
"acc": 0.4373652518320964,
"acc_stderr": 0.010268101875758145
},
"harness|drop|3": {
"em": 0.24842701342281878,
"em_stderr": 0.004425115813837483,
"f1": 0.3269431627516796,
"f1_stderr": 0.004386855622561775
},
"harness|gsm8k|5": {
"acc": 0.10993176648976498,
"acc_stderr": 0.008616195587865406
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650886
}
}
```
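As a sanity check, the `acc` value in the `"all"` block is simply the mean of the per-task accuracies. A minimal sketch in plain Python, with the values copied from the JSON above (the variable names are illustrative, not part of the dataset's API):

```python
# Per-task accuracies copied from the results JSON above.
per_task_acc = {
    "harness|gsm8k|5": 0.10993176648976498,
    "harness|winogrande|5": 0.7647987371744278,
}

# The aggregated "all" accuracy is the unweighted mean over tasks
# that report an "acc" metric.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
```

Here `mean_acc` reproduces the `0.4373652518320964` reported in the `"all"` block (up to floating-point rounding).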
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of Yukang/LongAlpaca-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/LongAlpaca-13B](https://huggingface.co/Yukang/LongAlpaca-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__LongAlpaca-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T22:00:30.556276](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-13B/blob/main/results_2023-10-27T22-00-30.556276.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17051174496644295,\n\
\ \"em_stderr\": 0.003851429222727117,\n \"f1\": 0.23656669463087293,\n\
\ \"f1_stderr\": 0.003934121554985558,\n \"acc\": 0.32044198895027626,\n\
\ \"acc_stderr\": 0.006741557601060113\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.17051174496644295,\n \"em_stderr\": 0.003851429222727117,\n\
\ \"f1\": 0.23656669463087293,\n \"f1_stderr\": 0.003934121554985558\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6408839779005525,\n\
\ \"acc_stderr\": 0.013483115202120225\n }\n}\n```"
repo_url: https://huggingface.co/Yukang/LongAlpaca-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T22_00_30.556276
path:
- '**/details_harness|drop|3_2023-10-27T22-00-30.556276.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T22-00-30.556276.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T22_00_30.556276
path:
- '**/details_harness|gsm8k|5_2023-10-27T22-00-30.556276.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T22-00-30.556276.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-19-51.890196.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-19-51.890196.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T22_00_30.556276
path:
- '**/details_harness|winogrande|5_2023-10-27T22-00-30.556276.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T22-00-30.556276.parquet'
- config_name: results
data_files:
- split: 2023_10_09T09_19_51.890196
path:
- results_2023-10-09T09-19-51.890196.parquet
- split: 2023_10_27T22_00_30.556276
path:
- results_2023-10-27T22-00-30.556276.parquet
- split: latest
path:
- results_2023-10-27T22-00-30.556276.parquet
---
# Dataset Card for Evaluation run of Yukang/LongAlpaca-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/LongAlpaca-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/LongAlpaca-13B](https://huggingface.co/Yukang/LongAlpaca-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__LongAlpaca-13B",
"harness_winogrande_5",
split="train")
```
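Each run's split name is simply its timestamp with the dashes of the date and the colons of the time replaced by underscores (compare the `split:` entries with the timestamps embedded in the parquet filenames above). A minimal helper mirroring that convention; this is an observation from the listing in this card, not an official API:

```python
def split_name(ts: str) -> str:
    """Derive a config split name from a run timestamp, e.g.
    '2023-10-27T22:00:30.556276' -> '2023_10_27T22_00_30.556276'."""
    date_part, time_part = ts.split("T")
    # Dashes in the date and colons in the time become underscores;
    # the fractional-seconds dot is kept as-is.
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")
```

For example, `split_name("2023-10-27T22:00:30.556276")` yields the `2023_10_27T22_00_30.556276` split used by the `harness_winogrande_5` config above.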
## Latest results
These are the [latest results from run 2023-10-27T22:00:30.556276](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-13B/blob/main/results_2023-10-27T22-00-30.556276.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.17051174496644295,
"em_stderr": 0.003851429222727117,
"f1": 0.23656669463087293,
"f1_stderr": 0.003934121554985558,
"acc": 0.32044198895027626,
"acc_stderr": 0.006741557601060113
},
"harness|drop|3": {
"em": 0.17051174496644295,
"em_stderr": 0.003851429222727117,
"f1": 0.23656669463087293,
"f1_stderr": 0.003934121554985558
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120225
}
}
```
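The snippet above is plain JSON, and the per-task blocks roll up into the top-level `"all"` entry. A minimal sketch (metric values copied verbatim from the snippet) showing that the aggregate accuracy is just the mean over the tasks that report `acc`:

```python
# Per-task metrics copied verbatim from the results snippet above.
results = {
    "harness|drop|3": {"em": 0.17051174496644295, "f1": 0.23656669463087293},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.6408839779005525},
}

# Flatten the accuracy-style metrics into {task: value}.
accuracies = {task: m["acc"] for task, m in results.items() if "acc" in m}

# The top-level "all" accuracy is the mean over the tasks reporting `acc`
# (here gsm8k and winogrande), matching "acc": 0.32044198895027626 above.
mean_acc = sum(accuracies.values()) / len(accuracies)
```

The same relationship holds for `em` and `f1`, which only the `drop` task reports here.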
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,495 | [
[
-0.033416748046875,
-0.051849365234375,
0.017547607421875,
0.02691650390625,
-0.01708984375,
0.002292633056640625,
-0.032196044921875,
-0.021392822265625,
0.03582763671875,
0.043121337890625,
-0.052581787109375,
-0.0697021484375,
-0.047607421875,
0.016403198... |
open-llm-leaderboard/details_IkariDev__Athena-v4 | 2023-10-25T17:00:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T09:30:04 | ---
pretty_name: Evaluation run of IkariDev/Athena-v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [IkariDev/Athena-v4](https://huggingface.co/IkariDev/Athena-v4) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IkariDev__Athena-v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T17:00:26.530924](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-v4/blob/main/results_2023-10-25T17-00-26.530924.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05432046979865772,\n\
\ \"em_stderr\": 0.002321097609357669,\n \"f1\": 0.13087562919463042,\n\
\ \"f1_stderr\": 0.0026936499511124616,\n \"acc\": 0.44229322757129275,\n\
\ \"acc_stderr\": 0.010432110783601959\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.05432046979865772,\n \"em_stderr\": 0.002321097609357669,\n\
\ \"f1\": 0.13087562919463042,\n \"f1_stderr\": 0.0026936499511124616\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \
\ \"acc_stderr\": 0.008944213403553046\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650872\n\
\ }\n}\n```"
repo_url: https://huggingface.co/IkariDev/Athena-v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T17_00_26.530924
path:
- '**/details_harness|drop|3_2023-10-25T17-00-26.530924.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T17-00-26.530924.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T17_00_26.530924
path:
- '**/details_harness|gsm8k|5_2023-10-25T17-00-26.530924.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T17-00-26.530924.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-29-40.768179.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T09-29-40.768179.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T17_00_26.530924
path:
- '**/details_harness|winogrande|5_2023-10-25T17-00-26.530924.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T17-00-26.530924.parquet'
- config_name: results
data_files:
- split: 2023_10_09T09_29_40.768179
path:
- results_2023-10-09T09-29-40.768179.parquet
- split: 2023_10_25T17_00_26.530924
path:
- results_2023-10-25T17-00-26.530924.parquet
- split: latest
path:
- results_2023-10-25T17-00-26.530924.parquet
---
# Dataset Card for Evaluation run of IkariDev/Athena-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/IkariDev/Athena-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [IkariDev/Athena-v4](https://huggingface.co/IkariDev/Athena-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IkariDev__Athena-v4",
"harness_winogrande_5",
split="train")
```
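Because each run's split is named with its zero-padded timestamp, split names sort lexicographically in chronological order. If you ever need to resolve the newest timestamped split yourself rather than rely on the `latest` alias, a minimal sketch (using the split names from this card):

```python
# Timestamped split names, as they appear in this card's configs.
splits = [
    "2023_10_09T09_29_40.768179",
    "2023_10_25T17_00_26.530924",
]

# The zero-padded "YYYY_MM_DDTHH_MM_SS" prefix makes lexicographic
# order coincide with chronological order, so max() picks the newest run.
latest_split = max(splits)
print(latest_split)  # 2023_10_25T17_00_26.530924
```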
## Latest results
These are the [latest results from run 2023-10-25T17:00:26.530924](https://huggingface.co/datasets/open-llm-leaderboard/details_IkariDev__Athena-v4/blob/main/results_2023-10-25T17-00-26.530924.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.05432046979865772,
"em_stderr": 0.002321097609357669,
"f1": 0.13087562919463042,
"f1_stderr": 0.0026936499511124616,
"acc": 0.44229322757129275,
"acc_stderr": 0.010432110783601959
},
"harness|drop|3": {
"em": 0.05432046979865772,
"em_stderr": 0.002321097609357669,
"f1": 0.13087562919463042,
"f1_stderr": 0.0026936499511124616
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.008944213403553046
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650872
}
}
```
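The `acc` value reported under `all` above is the unweighted mean of the per-task accuracies, which can be checked directly from the numbers in the JSON:

```python
# Per-task accuracies reported in the results JSON above.
gsm8k_acc = 0.1197877179681577
winogrande_acc = 0.7647987371744278

# The aggregated "acc" under "all" is their unweighted mean.
all_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(all_acc - 0.44229322757129275) < 1e-12
print(all_acc)  # 0.44229322757129275
```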
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,532 | [
[
-0.038543701171875,
-0.042724609375,
0.0210723876953125,
0.01175689697265625,
-0.0219879150390625,
0.0031223297119140625,
-0.018646240234375,
-0.019500732421875,
0.0294952392578125,
0.042449951171875,
-0.053619384765625,
-0.072265625,
-0.0555419921875,
0.018... |
cambridgeltl/posqa | 2023-10-23T09:14:31.000Z | [
"task_categories:text-classification",
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"arxiv:2310.13394",
"region:us"
] | cambridgeltl | null | null | 0 | 0 | 2023-10-09T09:40:22 | ---
license: apache-2.0
task_categories:
- text-classification
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
This dataset is based on our publication *POSQA: Probe the World Models of LLMs with Size Comparisons* ([PDF](https://arxiv.org/abs/2310.13394)).
More details: [GitHub Repo](https://github.com/cambridgeltl/POSQA)
| 346 | [
[
-0.030731201171875,
-0.019195556640625,
0.05523681640625,
-0.00955963134765625,
-0.0114288330078125,
-0.022613525390625,
0.0287322998046875,
-0.006214141845703125,
0.01995849609375,
0.056396484375,
-0.0521240234375,
-0.043060302734375,
-0.0272369384765625,
0... |
bjoernp/laion-2b-mistral_captions-1.3M | 2023-10-09T10:00:11.000Z | [
"region:us"
] | bjoernp | null | null | 0 | 0 | 2023-10-09T09:58:51 | ---
dataset_info:
features:
- name: TEXT
dtype: string
- name: RESPONSE
dtype: string
- name: captions
sequence: string
splits:
- name: train
num_bytes: 853385896.3491833
num_examples: 1318108
download_size: 540262191
dataset_size: 853385896.3491833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "laion-2b-mistral_captions-1.3M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 560 | [
[
-0.023040771484375,
-0.0010232925415039062,
0.0102081298828125,
0.034759521484375,
-0.033203125,
-0.0215301513671875,
0.02313232421875,
-0.00501251220703125,
0.04351806640625,
0.0506591796875,
-0.049224853515625,
-0.040802001953125,
-0.04150390625,
-0.028656... |
suminlim/repo_name | 2023-10-09T10:42:36.000Z | [
"region:us"
] | suminlim | null | null | 0 | 0 | 2023-10-09T10:42:36 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
suminlim/iherb_items | 2023-10-09T10:42:56.000Z | [
"region:us"
] | suminlim | null | null | 0 | 0 | 2023-10-09T10:42:56 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
happylkx/InstructCoder | 2023-11-02T06:08:44.000Z | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"code",
"region:us"
] | happylkx | null | null | 2 | 0 | 2023-10-09T11:21:14 | ---
task_categories:
- text-generation
language:
- en
tags:
- code
pretty_name: instruct_coder
size_categories:
- 100K<n<1M
---
<div align="center">
<img src="https://github.com/Happylkx/InstructCoder/raw/main/docs/logo.png">
</div>
<div align="center">
<a href="https://github.com/qishenghu/CodeInstruct/blob/main/CodeInstruct.pdf">Paper</a> |
<a href="https://github.com/qishenghu/CodeInstruct">Code</a> |
<a href="https://happylkx.github.io/InstructCoder/">Blog</a>
<!-- <a href="https://blog.nus.edu.sg/kaixinli/2023/05/23/codeinstruct/">Blog</a> -->
</div>
<!-- | [Checkpoints](link_to_checkpoints) -->
# InstructCoder (CodeInstruct): Empowering Language Models to Edit Code
## Updates
- May 23, 2023: Paper, code and data released.
## Overview
InstructCoder is the first dataset designed to adapt LLMs for general code editing. It consists of 114,239 instruction-input-output triplets and covers multiple distinct code editing scenarios, generated by ChatGPT. LLaMA-33B finetuned on InstructCoder performs on par with ChatGPT on a real-world test set derived from GitHub commits.

In the ever-evolving world of software development, efficient and effective code editing plays a pivotal role. As the demands for faster development cycles and increased productivity continue to rise, harnessing the power of Large Language Models in code editing has emerged as a game-changer. This project aims to revolutionize code editing, empowering developers with intelligent tools and techniques.
Though the community has studied coding with LLMs extensively, our experiments demonstrate that current LLMs still struggle with code editing. To this end, we curate a code editing dataset, dubbed InstructCoder, for improving and evaluating the code editing abilities of LLMs. InstructCoder is an instructional dataset containing diverse code-editing tasks, covering 20 scenarios where code editing may be performed. [The dataset is released here.](https://github.com/Happylkx/InstructCoder) Our empirical experiments reveal that open-source LLMs display notable gains in code editing abilities after finetuning on InstructCoder. Some qualitative examples are provided.
## Data Collection
InstructCoder is systematically collected through an iterative process. Initial seed tasks are selected from GitHub commits. Inspired by Self-Instruct, we prompt ChatGPT to bootstrap new instructions. Then, plausible scenarios where the filtered instructions may be used are generated. Finally, corresponding code input and output are obtained conditioned on both the instruction and scenario. High-quality samples are manually selected and repeatedly added to the task pool for further generation.
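The iterative loop described above can be sketched as follows. Everything here is illustrative: `propose_instructions` and `generate_sample` are hypothetical stand-ins for the actual ChatGPT prompts, and the deduplication and quality checks are simplified placeholders, not the pipeline's exact filters.

```python
# Illustrative sketch of the iterative bootstrap described above.
# propose_instructions / generate_sample stand in for ChatGPT calls.

def propose_instructions(seed_tasks):
    # Placeholder: in the real pipeline, ChatGPT is prompted with
    # in-context seed tasks to bootstrap new editing instructions.
    return [f"{t} (variant)" for t in seed_tasks]

def generate_sample(instruction):
    # Placeholder: ChatGPT first generates a plausible scenario, then
    # input/output code conditioned on the instruction and scenario.
    return {"instruction": instruction,
            "scenario": "placeholder scenario",
            "input": "placeholder input code",
            "output": "placeholder output code"}

def bootstrap(seed_tasks, rounds=2):
    pool = list(seed_tasks)
    seen = {t.lower() for t in seed_tasks}
    for _ in range(rounds):
        for inst in propose_instructions(pool):
            if inst.lower() in seen:      # crude dedup stand-in
                continue
            sample = generate_sample(inst)
            if sample["output"]:          # crude quality-filter stand-in
                pool.append(inst)
                seen.add(inst.lower())
    return pool

pool = bootstrap(["Rename variable x to count"])
print(len(pool))  # prints 3: the seed plus one new instruction per round
```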
## Results
The largest model fine-tuned, LLaMA-33B, performs on par with ChatGPT, with 79.3% accuracy on the test set. Some qualitative examples generated by our model are shown here.
## Release
We are planning to release the following assets:
- [x] Full dataset
- [x] Source Code
<!-- The LoRA checkpoints are available [here](link_to_checkpoints). -->
## Qualitative Examples
In this section, we showcase three qualitative examples generated by the LLaMA-33B model.
1. **Instruction:** Transform the code into a class method called ‘calculate_distance’ that calculates the distance between two points in 3D space.
**Input code:**
```python
import math
def distance(point1, point2):
x1, y1, z1 = point1
x2, y2, z2 = point2
dx = x2 - x1
dy = y2 - y1
dz = z2 - z1
return math.sqrt(dx**2 + dy**2 + dz**2)
point1 = (0, 0, 0)
point2 = (1, 1, 1)
print(distance(point1, point2))
```
**Output code:**
```python
import math
class Point:
def __init__(self, x, y, z):
self.x = x
self.y = y
self.z = z
def calculate_distance(self, other_point):
dx = other_point.x - self.x
dy = other_point.y - self.y
dz = other_point.z - self.z
return math.sqrt(dx**2 + dy**2 + dz**2)
point1 = Point(0, 0, 0)
point2 = Point(1, 1, 1)
print(point1.calculate_distance(point2))
```
2. **Instruction:** Integrate error handling into the code that performs batch processing of large datasets, including job failures or incomplete runs.
**Input code:**
```python
import pandas as pd
def process_orders(orders_file, shipping_file, inventory_file):
orders = pd.read_csv(orders_file)
shipping = pd.read_csv(shipping_file)
inventory = pd.read_csv(inventory_file)
# Perform data processing and analysis here...
results = {"success": True, "message": "Batch processing completed successfully."}
return results
orders_file = "orders.csv"
shipping_file = "shipping.csv"
inventory_file = "inventory.csv"
results = process_orders(orders_file, shipping_file, inventory_file)
print(results)
```
**Output code:**
```python
import pandas as pd
import logging
logging.basicConfig(filename='batch_processing.log', level=logging.ERROR)
def process_orders(orders_file, shipping_file, inventory_file):
try:
orders = pd.read_csv(orders_file)
shipping = pd.read_csv(shipping_file)
inventory = pd.read_csv(inventory_file)
# Perform data processing and analysis here...
results = {"success": True, "message": "Batch processing completed successfully."}
return results
except Exception as e:
logging.error(f"Error processing batch: {e}")
results = {"success": False, "message": "Batch processing failed. Please check the input files for any issues."}
return results
orders_file = "orders.csv"
shipping_file = "shipping.csv"
inventory_file = "inventory.csv"
results = process_orders(orders_file, shipping_file, inventory_file)
print(results)
```
3. **Instruction:** Refactor any code that uses ‘exec’ or ‘eval’ for security reasons.
**Input code:**
```python
game_data = {
"score": 0,
"level": 1
}
def execute_script(script):
exec(script)
script = """
game_data["score"] += 10
if game_data["score"] >= 100:
game_data["level"] += 1
"""
execute_script(script)
print(game_data)
```
**Output code:**
```python
game_data = {
"score": 0,
"level": 1
}
def execute_script(script):
    compiled_script = compile(script, "<string>", "exec")
exec(compiled_script, game_data)
script = """
game_data["score"] += 10
if game_data["score"] >= 100:
game_data["level"] += 1
"""
execute_script(script)
print(game_data)
```
Now the script is not able to access the data in game_data.
## Limitations and Safety Concerns
By recognizing and addressing limitations and safety concerns, we hope developers can foster responsible AI deployment and use, promoting a more ethical, equitable, and secure future for large language models in code production.
1. **License:** You must ensure compliance with the corresponding licenses of the LLMs. It is important to thoroughly review the licenses and make sure that your usage is in accordance with their terms. For instance, you are not allowed to use LLaMA commercially, for it is released under a noncommercial license.
2. **Incomplete or Imperfect Knowledge:** LLMs are trained on vast amounts of data, which may not always be up-to-date or entirely accurate. For example, the APIs of a library may change over time. Consequently, the information provided by the models could be outdated, inaccurate, or even misleading in some instances.
3. **Overuse and Dependency:** Users might incorrectly interpret or rely too heavily on the outputs generated by large language models. It is crucial to provide proper guidance and promote an understanding of the model’s limitations, encouraging users to critically assess and verify the information or suggestions provided. Please make sure to check the generation of the models before using them.
Overreliance on large language models could lead to complacency, potentially causing users to undervalue human intelligence, such as creativity and critical thinking. We encourage users to use AI as a tool to supplement, rather than replace, human input and judgment.
4. **Malicious Use:** There is a risk that malicious actors might use the tools for nefarious purposes, such as generating malicious software. It is important to monitor the use and deployment of these models, track and report abuse, and develop countermeasures to address potential malicious activity.
5. **Bias and Discrimination:** Language models can inherit societal biases present in their training data, possibly leading to discriminatory or biased generations. Though our dataset is not likely to contain such toxic data, they may appear in the responses because of the base LLMs.
## Citation
Feel free to cite our work if you find it interesting or use the data:
```plain
@misc{2023instructcoder,
title={InstructCoder: Empowering Language Models to Edit Code},
url={https://github.com/qishenghu/CodeInstruct},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
author={Hu, Qisheng and Li, Kaixin and Zhao, Xu and Xie, Yuxi and Liu, Tiedong and Chen, Hui and Xie, Qizhe and He, Junxian}}
```
## Conclusion
The integration of AI into code editing represents a significant milestone in the evolution of software development. By leveraging AI’s capabilities in understanding code semantics, patterns, and best practices, developers can unlock new levels of productivity, code quality, and efficiency. The project we’ve explored demonstrates the immense potential of intelligent code editing tools. As the software development landscape continues to evolve, embracing AI is poised to become standard practice, setting the stage for a future where developers can focus more on creativity and problem-solving, while AI handles the mundane aspects of coding.
| 9,726 | [
[
-0.01371002197265625,
-0.06036376953125,
0.02923583984375,
0.0199432373046875,
0.00165557861328125,
0.00435638427734375,
-0.01084136962890625,
-0.0408935546875,
-0.004901885986328125,
0.03485107421875,
-0.035919189453125,
-0.055999755859375,
-0.035369873046875,
... |
open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3 | 2023-10-28T14:24:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T11:48:41 | ---
pretty_name: Evaluation run of migtissera/SynthIA-7B-v1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/SynthIA-7B-v1.3](https://huggingface.co/migtissera/SynthIA-7B-v1.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T14:24:19.449160](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3/blob/main/results_2023-10-28T14-24-19.449160.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.34375,\n \
\ \"em_stderr\": 0.004864023482291936,\n \"f1\": 0.43760067114094225,\n\
\ \"f1_stderr\": 0.004666454920595155,\n \"acc\": 0.4821837715185681,\n\
\ \"acc_stderr\": 0.010982434159881403\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.34375,\n \"em_stderr\": 0.004864023482291936,\n \
\ \"f1\": 0.43760067114094225,\n \"f1_stderr\": 0.004666454920595155\n \
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17589082638362397,\n \
\ \"acc_stderr\": 0.010487120635539617\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n\
\ }\n}\n```"
repo_url: https://huggingface.co/migtissera/SynthIA-7B-v1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|arc:challenge|25_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|arc:challenge|25_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T14_24_19.449160
path:
- '**/details_harness|drop|3_2023-10-28T14-24-19.449160.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T14-24-19.449160.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T14_24_19.449160
path:
- '**/details_harness|gsm8k|5_2023-10-28T14-24-19.449160.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T14-24-19.449160.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hellaswag|10_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hellaswag|10_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-48-18.823660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T11-48-18.823660.parquet'
- split: 2023_10_09T11_58_55.532772
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T11-58-55.532772.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T11-58-55.532772.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T14_24_19.449160
path:
- '**/details_harness|winogrande|5_2023-10-28T14-24-19.449160.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T14-24-19.449160.parquet'
- config_name: results
data_files:
- split: 2023_10_09T11_48_18.823660
path:
- results_2023-10-09T11-48-18.823660.parquet
- split: 2023_10_09T11_58_55.532772
path:
- results_2023-10-09T11-58-55.532772.parquet
- split: 2023_10_28T14_24_19.449160
path:
- results_2023-10-28T14-24-19.449160.parquet
- split: latest
path:
- results_2023-10-28T14-24-19.449160.parquet
---
# Dataset Card for Evaluation run of migtissera/SynthIA-7B-v1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/SynthIA-7B-v1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/SynthIA-7B-v1.3](https://huggingface.co/migtissera/SynthIA-7B-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3",
"harness_winogrande_5",
split="train")
```
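To pull the details from a specific run instead of the latest one, pass that run's split name. Judging from the configuration listing above, split names appear to be the run timestamp with `-` and `:` replaced by `_` — an inferred convention, not a documented API guarantee:

```python
# Derive the split name for a given run timestamp.
# Assumption (inferred from the config listing above): "-" and ":" become "_".
timestamp = "2023-10-28T14:24:19.449160"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_10_28T14_24_19.449160
```

The resulting string can then be used as the `split` argument in `load_dataset` in place of `"latest"`.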
## Latest results
These are the [latest results from run 2023-10-28T14:24:19.449160](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3/blob/main/results_2023-10-28T14-24-19.449160.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.34375,
"em_stderr": 0.004864023482291936,
"f1": 0.43760067114094225,
"f1_stderr": 0.004666454920595155,
"acc": 0.4821837715185681,
"acc_stderr": 0.010982434159881403
},
"harness|drop|3": {
"em": 0.34375,
"em_stderr": 0.004864023482291936,
"f1": 0.43760067114094225,
"f1_stderr": 0.004666454920595155
},
"harness|gsm8k|5": {
"acc": 0.17589082638362397,
"acc_stderr": 0.010487120635539617
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
}
}
```
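As a quick, self-contained illustration of working with these aggregated metrics (using the per-task numbers shown above rather than a Hub download), the snippet below picks out the task with the highest accuracy:

```python
# Per-task metrics copied from the results above.
results = {
    "harness|gsm8k|5": {"acc": 0.17589082638362397, "acc_stderr": 0.010487120635539617},
    "harness|winogrande|5": {"acc": 0.7884767166535123, "acc_stderr": 0.011477747684223188},
}

# Collect accuracy per task and find the strongest one.
accs = {task: metrics["acc"] for task, metrics in results.items()}
best_task = max(accs, key=accs.get)
print(best_task)  # harness|winogrande|5
```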
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b | 2023-10-29T11:13:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T12:06:50 | ---
pretty_name: Evaluation run of ehartford/dolphin-2.0-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/dolphin-2.0-mistral-7b](https://huggingface.co/ehartford/dolphin-2.0-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T11:13:09.242733](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b/blob/main/results_2023-10-29T11-13-09.242733.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32843959731543626,\n\
\ \"em_stderr\": 0.0048096109452043685,\n \"f1\": 0.3948563338926188,\n\
\ \"f1_stderr\": 0.004687030417639075,\n \"acc\": 0.47012697069663045,\n\
\ \"acc_stderr\": 0.011418752673563709\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.32843959731543626,\n \"em_stderr\": 0.0048096109452043685,\n\
\ \"f1\": 0.3948563338926188,\n \"f1_stderr\": 0.004687030417639075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1865049279757392,\n \
\ \"acc_stderr\": 0.010729140039689892\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/dolphin-2.0-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T11_13_09.242733
path:
- '**/details_harness|drop|3_2023-10-29T11-13-09.242733.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T11-13-09.242733.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T11_13_09.242733
path:
- '**/details_harness|gsm8k|5_2023-10-29T11-13-09.242733.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T11-13-09.242733.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-06-26.268228.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-06-26.268228.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T11_13_09.242733
path:
- '**/details_harness|winogrande|5_2023-10-29T11-13-09.242733.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T11-13-09.242733.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_06_26.268228
path:
- results_2023-10-09T12-06-26.268228.parquet
- split: 2023_10_29T11_13_09.242733
path:
- results_2023-10-29T11-13-09.242733.parquet
- split: latest
path:
- results_2023-10-29T11-13-09.242733.parquet
---
# Dataset Card for Evaluation run of ehartford/dolphin-2.0-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/dolphin-2.0-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.0-mistral-7b](https://huggingface.co/ehartford/dolphin-2.0-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T11:13:09.242733](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.0-mistral-7b/blob/main/results_2023-10-29T11-13-09.242733.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.32843959731543626,
"em_stderr": 0.0048096109452043685,
"f1": 0.3948563338926188,
"f1_stderr": 0.004687030417639075,
"acc": 0.47012697069663045,
"acc_stderr": 0.011418752673563709
},
"harness|drop|3": {
"em": 0.32843959731543626,
"em_stderr": 0.0048096109452043685,
"f1": 0.3948563338926188,
"f1_stderr": 0.004687030417639075
},
"harness|gsm8k|5": {
"acc": 0.1865049279757392,
"acc_stderr": 0.010729140039689892
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437524
}
}
```
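The aggregated results above are plain nested dictionaries keyed by `harness|<task>|<n_shots>`, so individual metrics can be read out directly. A minimal sketch (the `results` dict below is hand-copied from the JSON above, not a live download):

```python
# Metrics mirror the JSON shown above: nested dicts keyed by
# "harness|<task>|<n_shots>", each holding metric / metric_stderr pairs.
results = {
    "all": {"acc": 0.47012697069663045, "acc_stderr": 0.011418752673563709},
    "harness|gsm8k|5": {"acc": 0.1865049279757392, "acc_stderr": 0.010729140039689892},
    "harness|winogrande|5": {"acc": 0.7537490134175217, "acc_stderr": 0.012108365307437524},
}

# Pull a single task's accuracy and render it as a percentage.
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"Winogrande (5-shot) accuracy: {winogrande_acc:.2%}")  # 75.37%
```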
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,696 |
open-llm-leaderboard/details_ehartford__samantha-mistral-7b | 2023-10-29T01:10:50.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T12:11:42 |
---
pretty_name: Evaluation run of ehartford/samantha-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-mistral-7b](https://huggingface.co/ehartford/samantha-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T01:10:37.829717](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-7b/blob/main/results_2023-10-29T01-10-37.829717.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.056732382550335574,\n\
\ \"em_stderr\": 0.0023690412638350568,\n \"f1\": 0.11221791107382512,\n\
\ \"f1_stderr\": 0.0025837884585332253,\n \"acc\": 0.4639627375502118,\n\
\ \"acc_stderr\": 0.010980763759790235\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.056732382550335574,\n \"em_stderr\": 0.0023690412638350568,\n\
\ \"f1\": 0.11221791107382512,\n \"f1_stderr\": 0.0025837884585332253\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1599696739954511,\n \
\ \"acc_stderr\": 0.010097377827752538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827934\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T01_10_37.829717
path:
- '**/details_harness|drop|3_2023-10-29T01-10-37.829717.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T01-10-37.829717.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T01_10_37.829717
path:
- '**/details_harness|gsm8k|5_2023-10-29T01-10-37.829717.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T01-10-37.829717.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-11-18.939016.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-11-18.939016.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T01_10_37.829717
path:
- '**/details_harness|winogrande|5_2023-10-29T01-10-37.829717.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T01-10-37.829717.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_11_18.939016
path:
- results_2023-10-09T12-11-18.939016.parquet
- split: 2023_10_29T01_10_37.829717
path:
- results_2023-10-29T01-10-37.829717.parquet
- split: latest
path:
- results_2023-10-29T01-10-37.829717.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-mistral-7b](https://huggingface.co/ehartford/samantha-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-mistral-7b",
"harness_winogrande_5",
split="train")
```
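Split names other than `latest` encode the run timestamp, with `_` replacing the `-` and `:` of the ISO form (e.g. `2023_10_29T01_10_37.829717`). A small helper to recover the actual datetime from such a split name — a sketch, not part of any official tooling:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names such as "2023_10_29T01_10_37.829716" use "_" in place
    # of "-" (date part) and ":" (time part); "latest" is an alias, not
    # a timestamp, so it should be resolved before calling this.
    date_part, time_part = split_name.split("T")
    return datetime.fromisoformat(
        date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    )

print(split_to_datetime("2023_10_29T01_10_37.829717"))  # 2023-10-29 01:10:37.829717
```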
## Latest results
These are the [latest results from run 2023-10-29T01:10:37.829717](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-7b/blob/main/results_2023-10-29T01-10-37.829717.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.056732382550335574,
"em_stderr": 0.0023690412638350568,
"f1": 0.11221791107382512,
"f1_stderr": 0.0025837884585332253,
"acc": 0.4639627375502118,
"acc_stderr": 0.010980763759790235
},
"harness|drop|3": {
"em": 0.056732382550335574,
"em_stderr": 0.0023690412638350568,
"f1": 0.11221791107382512,
"f1_stderr": 0.0025837884585332253
},
"harness|gsm8k|5": {
"acc": 0.1599696739954511,
"acc_stderr": 0.010097377827752538
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827934
}
}
```
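For the accuracy metrics, the `all` block appears to be the unweighted mean of the per-task `acc` values (the `em`/`f1` numbers are carried over from DROP directly, since it is the only task reporting them). A quick sanity check using the figures above:

```python
# Per-task accuracies copied from the results JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.1599696739954511},
    "harness|winogrande|5": {"acc": 0.7679558011049724},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # ~0.46396, matching the "acc" reported under "all"
```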
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,670 | [
open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b (last modified 2023-10-29T11:08:17, created 2023-10-09T12:17:49)
---
pretty_name: Evaluation run of ehartford/samantha-mistral-instruct-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/samantha-mistral-instruct-7b](https://huggingface.co/ehartford/samantha-mistral-instruct-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T11:08:05.162648](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b/blob/main/results_2023-10-29T11-08-05.162648.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31291946308724833,\n\
\ \"em_stderr\": 0.004748536304260034,\n \"f1\": 0.36725566275167865,\n\
\ \"f1_stderr\": 0.0046625848085346845,\n \"acc\": 0.4062203613868821,\n\
\ \"acc_stderr\": 0.010696600366483247\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31291946308724833,\n \"em_stderr\": 0.004748536304260034,\n\
\ \"f1\": 0.36725566275167865,\n \"f1_stderr\": 0.0046625848085346845\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \
\ \"acc_stderr\": 0.008563852506627485\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7040252565114443,\n \"acc_stderr\": 0.012829348226339011\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/samantha-mistral-instruct-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T11_08_05.162648
path:
- '**/details_harness|drop|3_2023-10-29T11-08-05.162648.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T11-08-05.162648.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T11_08_05.162648
path:
- '**/details_harness|gsm8k|5_2023-10-29T11-08-05.162648.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T11-08-05.162648.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-17-25.772796.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-17-25.772796.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T11_08_05.162648
path:
- '**/details_harness|winogrande|5_2023-10-29T11-08-05.162648.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T11-08-05.162648.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_17_25.772796
path:
- results_2023-10-09T12-17-25.772796.parquet
- split: 2023_10_29T11_08_05.162648
path:
- results_2023-10-29T11-08-05.162648.parquet
- split: latest
path:
- results_2023-10-29T11-08-05.162648.parquet
---
# Dataset Card for Evaluation run of ehartford/samantha-mistral-instruct-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/samantha-mistral-instruct-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/samantha-mistral-instruct-7b](https://huggingface.co/ehartford/samantha-mistral-instruct-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b",
"harness_winogrande_5",
split="train")
```
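The MMLU subtask configs listed in the YAML above all follow the same naming pattern, so a specific one can be addressed programmatically. This is a small sketch; only the naming convention is taken from this card, and the helper name `harness_config` is illustrative:

```python
def harness_config(task: str, n_shot: int = 5) -> str:
    """Build a config name like 'harness_hendrycksTest_abstract_algebra_5'."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# For example, to target the abstract algebra subtask:
print(harness_config("abstract_algebra"))  # harness_hendrycksTest_abstract_algebra_5
```

The resulting string can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above.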
## Latest results
These are the [latest results from run 2023-10-29T11:08:05.162648](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__samantha-mistral-instruct-7b/blob/main/results_2023-10-29T11-08-05.162648.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31291946308724833,
"em_stderr": 0.004748536304260034,
"f1": 0.36725566275167865,
"f1_stderr": 0.0046625848085346845,
"acc": 0.4062203613868821,
"acc_stderr": 0.010696600366483247
},
"harness|drop|3": {
"em": 0.31291946308724833,
"em_stderr": 0.004748536304260034,
"f1": 0.36725566275167865,
"f1_stderr": 0.0046625848085346845
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.008563852506627485
},
"harness|winogrande|5": {
"acc": 0.7040252565114443,
"acc_stderr": 0.012829348226339011
}
}
```
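The aggregate `acc` reported under `"all"` lines up with the unweighted mean of the per-task accuracies. A quick sanity check in plain Python, using only the numbers reported above:

```python
# Per-task metrics copied from the results JSON above.
results = {
    "harness|drop|3": {"em": 0.31291946308724833, "f1": 0.36725566275167865},
    "harness|gsm8k|5": {"acc": 0.10841546626231995},
    "harness|winogrande|5": {"acc": 0.7040252565114443},
}

# Average over the tasks that report an "acc" metric (gsm8k and winogrande).
accs = [m["acc"] for m in results.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # ~0.4062203613868821, matching "all" -> "acc" above
```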
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,772 | [
[
-0.0235443115234375,
-0.04705810546875,
0.0193634033203125,
0.0070037841796875,
-0.007568359375,
0.0009503364562988281,
-0.0222015380859375,
-0.01393890380859375,
0.02667236328125,
0.039794921875,
-0.051025390625,
-0.0699462890625,
-0.0479736328125,
0.013442... |
dmrau/cqudubstack-android | 2023-10-09T12:19:38.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:19:34 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 47953
num_examples: 699
- name: corpus
num_bytes: 12840959
num_examples: 22998
download_size: 7657118
dataset_size: 12888912
---
# Dataset Card for "cqudubstack-android"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 624 | [
[
-0.040557861328125,
-0.01493072509765625,
0.017364501953125,
0.0222625732421875,
-0.033172607421875,
0.0286102294921875,
0.0302886962890625,
-0.0190582275390625,
0.06982421875,
0.042144775390625,
-0.0640869140625,
-0.050048828125,
-0.0275421142578125,
-0.021... |
dmrau/cqadubstack-android-qrels | 2023-10-09T12:19:39.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:19:38 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 43411
num_examples: 1696
download_size: 19993
dataset_size: 43411
---
# Dataset Card for "cqadubstack-android-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 519 | [
[
-0.045440673828125,
0.00707244873046875,
0.0161590576171875,
0.0140228271484375,
-0.0282745361328125,
0.03228759765625,
0.03607177734375,
-0.00714874267578125,
0.05230712890625,
0.037384033203125,
-0.06134033203125,
-0.0472412109375,
-0.02691650390625,
-0.01... |
dmrau/cqudubstack-gaming | 2023-10-09T12:19:52.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:19:48 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 105494
num_examples: 1595
- name: corpus
num_bytes: 20666596
num_examples: 45301
download_size: 12946080
dataset_size: 20772090
---
# Dataset Card for "cqudubstack-gaming"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 626 | [
[
-0.04522705078125,
-0.0296478271484375,
0.0108795166015625,
0.02825927734375,
-0.0163421630859375,
0.0235748291015625,
0.02276611328125,
-0.00968170166015625,
0.0606689453125,
0.03070068359375,
-0.07318115234375,
-0.057220458984375,
-0.023162841796875,
-0.03... |
dmrau/cqadubstack-gaming-qrels | 2023-10-09T12:19:53.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:19:52 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 60520
num_examples: 2263
download_size: 32524
dataset_size: 60520
---
# Dataset Card for "cqadubstack-gaming-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 518 | [
[
-0.0498046875,
-0.007038116455078125,
0.0114593505859375,
0.0197601318359375,
-0.0148773193359375,
0.02783203125,
0.0281982421875,
-0.00012803077697753906,
0.04632568359375,
0.0266571044921875,
-0.06646728515625,
-0.05364990234375,
-0.0244598388671875,
-0.02... |
dmrau/cqudubstack-mathematica | 2023-10-09T12:20:00.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:19:57 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 52792
num_examples: 804
- name: corpus
num_bytes: 18735825
num_examples: 16705
download_size: 10393860
dataset_size: 18788617
---
# Dataset Card for "cqudubstack-mathematica"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 629 | [
[
-0.03643798828125,
-0.0164947509765625,
0.004840850830078125,
0.024017333984375,
-0.0149993896484375,
0.033966064453125,
0.0207366943359375,
-0.014373779296875,
0.0601806640625,
0.0338134765625,
-0.069580078125,
-0.05950927734375,
-0.030487060546875,
-0.0273... |
dmrau/cqadubstack-mathematica-qrels | 2023-10-09T12:20:02.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:20:01 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 34691
num_examples: 1358
download_size: 18181
dataset_size: 34691
---
# Dataset Card for "cqadubstack-mathematica-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 523 | [
[
-0.04205322265625,
0.005847930908203125,
0.00731658935546875,
0.0196533203125,
-0.0145263671875,
0.033538818359375,
0.0242919921875,
-0.005939483642578125,
0.045867919921875,
0.02618408203125,
-0.06451416015625,
-0.053680419921875,
-0.0270843505859375,
-0.01... |
dmrau/cqudubstack-programmers | 2023-10-09T12:20:11.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:20:06 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 63785
num_examples: 876
- name: corpus
num_bytes: 32727262
num_examples: 32176
download_size: 19360000
dataset_size: 32791047
---
# Dataset Card for "cqudubstack-programmers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 629 | [
[
-0.0380859375,
-0.01406097412109375,
0.01082611083984375,
0.0301055908203125,
-0.004604339599609375,
0.0357666015625,
0.0175628662109375,
-0.01328277587890625,
0.05902099609375,
0.03851318359375,
-0.05859375,
-0.053558349609375,
-0.030303955078125,
-0.025787... |
dmrau/cqadubstack-programmers-qrels | 2023-10-09T12:20:13.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:20:11 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 45452
num_examples: 1675
download_size: 22632
dataset_size: 45452
---
# Dataset Card for "cqadubstack-programmers-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 523 | [
[
-0.0443115234375,
0.00753021240234375,
0.01263427734375,
0.0205230712890625,
-0.00689697265625,
0.037109375,
0.0252227783203125,
-0.004238128662109375,
0.04541015625,
0.031951904296875,
-0.056549072265625,
-0.048919677734375,
-0.0279693603515625,
-0.01762390... |
dmrau/cqudubstack-tex | 2023-10-09T12:21:06.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:00 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 186934
num_examples: 2906
- name: corpus
num_bytes: 86600423
num_examples: 68184
download_size: 43424126
dataset_size: 86787357
---
# Dataset Card for "cqudubstack-tex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 623 | [
[
-0.03704833984375,
-0.02154541015625,
0.02178955078125,
0.019012451171875,
-0.019012451171875,
0.03265380859375,
0.020111083984375,
-0.016632080078125,
0.058929443359375,
0.041259765625,
-0.055084228515625,
-0.06292724609375,
-0.0367431640625,
-0.02359008789... |
dmrau/cqadubstack-tex-qrels | 2023-10-09T12:21:08.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:07 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 137572
num_examples: 5154
download_size: 67107
dataset_size: 137572
---
# Dataset Card for "cqadubstack-tex-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 517 | [
[
-0.04193115234375,
0.0001361370086669922,
0.0199432373046875,
0.01309967041015625,
-0.0159912109375,
0.036041259765625,
0.025787353515625,
-0.006992340087890625,
0.044219970703125,
0.03509521484375,
-0.050933837890625,
-0.058502197265625,
-0.034332275390625,
... |
dmrau/cqudubstack-webmasters | 2023-10-09T12:21:13.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:10 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 34792
num_examples: 506
- name: corpus
num_bytes: 11659413
num_examples: 17405
download_size: 6885106
dataset_size: 11694205
---
# Dataset Card for "cqudubstack-webmasters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 627 | [
[
-0.043792724609375,
-0.010101318359375,
-0.00439453125,
0.0276641845703125,
-0.01342010498046875,
0.0278778076171875,
0.01480865478515625,
-0.017791748046875,
0.049835205078125,
0.046630859375,
-0.0638427734375,
-0.050750732421875,
-0.0288543701171875,
-0.01... |
dmrau/cqadubstack-webmasters-qrels | 2023-10-09T12:21:14.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:13 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 35771
num_examples: 1395
download_size: 16248
dataset_size: 35771
---
# Dataset Card for "cqadubstack-webmasters-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 522 | [
[
-0.049163818359375,
0.00804901123046875,
0.0003361701965332031,
0.018707275390625,
-0.0141754150390625,
0.027862548828125,
0.020843505859375,
-0.006816864013671875,
0.04168701171875,
0.033782958984375,
-0.0645751953125,
-0.049957275390625,
-0.02587890625,
-0... |
dmrau/cqudubstack-english | 2023-10-09T12:21:27.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:22 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 103588
num_examples: 1570
- name: corpus
num_bytes: 18199570
num_examples: 40221
download_size: 11382247
dataset_size: 18303158
---
# Dataset Card for "cqudubstack-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 627 | [
[
-0.0255279541015625,
-0.0210113525390625,
0.00627899169921875,
0.03277587890625,
-0.0283203125,
0.026275634765625,
0.00030112266540527344,
-0.0241241455078125,
0.07037353515625,
0.03582763671875,
-0.053192138671875,
-0.062225341796875,
-0.0374755859375,
-0.0... |
dmrau/cqadubstack-english-qrels | 2023-10-09T12:21:28.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:27 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 100171
num_examples: 3765
download_size: 45031
dataset_size: 100171
---
# Dataset Card for "cqadubstack-english-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 521 | [
[
-0.036712646484375,
0.0025997161865234375,
0.00820159912109375,
0.0215606689453125,
-0.01959228515625,
0.0286712646484375,
0.0133209228515625,
-0.01177215576171875,
0.05218505859375,
0.03338623046875,
-0.051849365234375,
-0.05584716796875,
-0.034271240234375,
... |
dmrau/cqudubstack-gis | 2023-10-09T12:21:37.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:32 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 61244
num_examples: 885
- name: corpus
num_bytes: 36704924
num_examples: 37637
download_size: 20083359
dataset_size: 36766168
---
# Dataset Card for "cqudubstack-gis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 621 | [
[
-0.04034423828125,
-0.0157928466796875,
0.01300048828125,
0.0084228515625,
-0.0210418701171875,
0.0295257568359375,
0.021240234375,
-0.006420135498046875,
0.06781005859375,
0.029632568359375,
-0.05450439453125,
-0.064453125,
-0.043548583984375,
-0.0285034179... |
dmrau/cqadubstack-gis-qrels | 2023-10-09T12:21:39.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:37 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 28952
num_examples: 1114
download_size: 17234
dataset_size: 28952
---
# Dataset Card for "cqadubstack-gis-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 515 | [
[
-0.042999267578125,
0.0028972625732421875,
0.01384735107421875,
0.004329681396484375,
-0.0188446044921875,
0.0303192138671875,
0.027008056640625,
0.000006377696990966797,
0.051483154296875,
0.0276031494140625,
-0.0531005859375,
-0.05975341796875,
-0.038818359375... |
dmrau/cqudubstack-physics | 2023-10-09T12:21:48.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:43 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 73255
num_examples: 1039
- name: corpus
num_bytes: 29949928
num_examples: 38316
download_size: 17827262
dataset_size: 30023183
---
# Dataset Card for "cqudubstack-physics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 626 | [
[
-0.0302276611328125,
-0.0160675048828125,
0.022613525390625,
0.0210418701171875,
-0.0170745849609375,
0.0258026123046875,
0.0272064208984375,
-0.0123291015625,
0.059967041015625,
0.016571044921875,
-0.058502197265625,
-0.038055419921875,
-0.0245819091796875,
... |
dmrau/cqadubstack-physics-qrels | 2023-10-09T12:21:50.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:49 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 50809
num_examples: 1933
download_size: 25022
dataset_size: 50809
---
# Dataset Card for "cqadubstack-physics-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 519 | [
[
-0.035797119140625,
0.00530242919921875,
0.02203369140625,
0.0141754150390625,
-0.01546478271484375,
0.0282135009765625,
0.033111572265625,
-0.002285003662109375,
0.046295166015625,
0.0132598876953125,
-0.05548095703125,
-0.036224365234375,
-0.0251007080078125,
... |
dmrau/cqudubstack-stats | 2023-10-09T12:21:59.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:53 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 47795
num_examples: 652
- name: corpus
num_bytes: 42923933
num_examples: 42269
download_size: 24679799
dataset_size: 42971728
---
# Dataset Card for "cqudubstack-stats"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 623 | [
[
-0.034698486328125,
-0.017425537109375,
0.0111846923828125,
0.0226593017578125,
-0.0205535888671875,
0.0207977294921875,
0.0251007080078125,
-0.01045989990234375,
0.0721435546875,
0.034210205078125,
-0.058807373046875,
-0.056121826171875,
-0.025482177734375,
... |
dmrau/cqadubstack-stats-qrels | 2023-10-09T12:22:01.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:21:59 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 23665
num_examples: 913
download_size: 13316
dataset_size: 23665
---
# Dataset Card for "cqadubstack-stats-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 516 | [
[
-0.04046630859375,
0.0036869049072265625,
0.0108795166015625,
0.0161285400390625,
-0.0181884765625,
0.0233917236328125,
0.0301361083984375,
-0.0003578662872314453,
0.05474853515625,
0.0286407470703125,
-0.0560302734375,
-0.050750732421875,
-0.0254974365234375,
... |
dmrau/cqudubstack-unix | 2023-10-09T12:22:11.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:22:06 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 72357
num_examples: 1072
- name: corpus
num_bytes: 46102756
num_examples: 47382
download_size: 24571026
dataset_size: 46175113
---
# Dataset Card for "cqudubstack-unix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 623 | [
[
-0.034149169921875,
-0.0179901123046875,
0.0248260498046875,
0.013397216796875,
-0.0252838134765625,
0.029144287109375,
0.022705078125,
-0.00713348388671875,
0.0648193359375,
0.048797607421875,
-0.059906005859375,
-0.0548095703125,
-0.027496337890625,
-0.024... |
dmrau/cqadubstack-unix-qrels | 2023-10-09T12:22:13.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:22:11 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 44636
num_examples: 1693
download_size: 23577
dataset_size: 44636
---
# Dataset Card for "cqadubstack-unix-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 516 | [
[
-0.04046630859375,
0.0030956268310546875,
0.0224609375,
0.00783538818359375,
-0.0217742919921875,
0.0311279296875,
0.0292205810546875,
0.0009794235229492188,
0.048919677734375,
0.040191650390625,
-0.0543212890625,
-0.049835205078125,
-0.0262298583984375,
-0.... |
dmrau/cqudubstack-wordpress | 2023-10-09T12:22:20.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:22:15 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 35736
num_examples: 541
- name: corpus
num_bytes: 53026140
num_examples: 48605
download_size: 26551471
dataset_size: 53061876
---
# Dataset Card for "cqudubstack-wordpress"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 627 | [
[
-0.042449951171875,
-0.0188446044921875,
0.0170745849609375,
0.02337646484375,
-0.031341552734375,
0.025421142578125,
0.010101318359375,
-0.01418304443359375,
0.0657958984375,
0.029937744140625,
-0.062469482421875,
-0.06634521484375,
-0.038482666015625,
-0.0... |
dmrau/cqadubstack-wordpress-qrels | 2023-10-09T12:22:21.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:22:20 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 19885
num_examples: 744
download_size: 11490
dataset_size: 19885
---
# Dataset Card for "cqadubstack-wordpress-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 520 | [
[
-0.0460205078125,
0.00402069091796875,
0.018707275390625,
0.0167999267578125,
-0.0290679931640625,
0.027923583984375,
0.0171966552734375,
-0.002506256103515625,
0.04974365234375,
0.0242156982421875,
-0.0628662109375,
-0.0638427734375,
-0.034759521484375,
-0.... |
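The leaderboard detail card that follows reports an overall `acc` in its `"all"` block alongside per-task accuracies. A small sketch, assuming the aggregate is the unweighted mean over the accuracy-reporting tasks (the numbers here are copied from that card's results JSON, and the mean does reproduce the reported overall value):

```python
import math

# Per-task accuracies from the Mistral-7B-OpenOrca results JSON below.
task_acc = {
    "harness|gsm8k|5": 0.19939347990902198,
    "harness|winogrande|5": 0.7774269928966061,
}

# Unweighted mean over the acc-reporting tasks.
overall = sum(task_acc.values()) / len(task_acc)

# Matches the "all" block's reported acc to floating-point precision.
assert math.isclose(overall, 0.48841023640281406, rel_tol=1e-12)
print(overall)
```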
open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca | 2023-10-29T06:23:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T12:29:01 | ---
pretty_name: Evaluation run of Open-Orca/Mistral-7B-OpenOrca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T06:22:53.674218](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca/blob/main/results_2023-10-29T06-22-53.674218.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13716442953020133,\n\
\ \"em_stderr\": 0.003523095554552689,\n \"f1\": 0.20527894295301938,\n\
\ \"f1_stderr\": 0.00363436386580985,\n \"acc\": 0.48841023640281406,\n\
\ \"acc_stderr\": 0.011348185919594158\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.13716442953020133,\n \"em_stderr\": 0.003523095554552689,\n\
\ \"f1\": 0.20527894295301938,\n \"f1_stderr\": 0.00363436386580985\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19939347990902198,\n \
\ \"acc_stderr\": 0.011005438029475652\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712666\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T06_22_53.674218
path:
- '**/details_harness|drop|3_2023-10-29T06-22-53.674218.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T06-22-53.674218.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T06_22_53.674218
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-22-53.674218.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-22-53.674218.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-28-38.184371.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-28-38.184371.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T06_22_53.674218
path:
- '**/details_harness|winogrande|5_2023-10-29T06-22-53.674218.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T06-22-53.674218.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_28_38.184371
path:
- results_2023-10-09T12-28-38.184371.parquet
- split: 2023_10_29T06_22_53.674218
path:
- results_2023-10-29T06-22-53.674218.parquet
- split: latest
path:
- results_2023-10-29T06-22-53.674218.parquet
---
# Dataset Card for Evaluation run of Open-Orca/Mistral-7B-OpenOrca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca",
"harness_winogrande_5",
split="train")
```
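Every per-task configuration listed above follows one naming convention: `harness_`, then the harness task name with `-` and `:` replaced by `_`, then the few-shot count. A small helper sketching that convention (the `config_name` function is ours, for illustration only; it is not part of the `datasets` API):

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name such as 'hendrycksTest-abstract_algebra'
    to the dataset config name used in this repo, e.g.
    'harness_hendrycksTest_abstract_algebra_5'."""
    slug = task.replace(":", "_").replace("-", "_")
    return f"harness_{slug}_{num_fewshot}"

print(config_name("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument of `load_dataset` exactly as in the snippet above.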
## Latest results
These are the [latest results from run 2023-10-29T06:22:53.674218](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-OpenOrca/blob/main/results_2023-10-29T06-22-53.674218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.13716442953020133,
"em_stderr": 0.003523095554552689,
"f1": 0.20527894295301938,
"f1_stderr": 0.00363436386580985,
"acc": 0.48841023640281406,
"acc_stderr": 0.011348185919594158
},
"harness|drop|3": {
"em": 0.13716442953020133,
"em_stderr": 0.003523095554552689,
"f1": 0.20527894295301938,
"f1_stderr": 0.00363436386580985
},
"harness|gsm8k|5": {
"acc": 0.19939347990902198,
"acc_stderr": 0.011005438029475652
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712666
}
}
```
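The `acc` reported in the `all` block is simply the mean of the per-task accuracies. A quick sanity check on the numbers above, in plain Python with no download required (the values are copied from the snippet and are illustrative):

```python
# Per-task metrics copied from the results snippet above.
results = {
    "harness|drop|3": {"em": 0.13716442953020133, "f1": 0.20527894295301938},
    "harness|gsm8k|5": {"acc": 0.19939347990902198},
    "harness|winogrande|5": {"acc": 0.7774269928966061},
}

# Collect the tasks that report an accuracy and average them.
accs = {task: m["acc"] for task, m in results.items() if "acc" in m}
mean_acc = sum(accs.values()) / len(accs)
print(f"mean acc = {mean_acc:.4f}")  # 0.4884, matching the "all" block
best = max(accs, key=accs.get)
print(best)  # harness|winogrande|5
```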
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_PulsarAI__Nebula-7B | 2023-10-23T05:55:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T12:30:01 | ---
pretty_name: Evaluation run of PulsarAI/Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/Nebula-7B](https://huggingface.co/PulsarAI/Nebula-7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__Nebula-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T05:54:57.990759](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Nebula-7B/blob/main/results_2023-10-23T05-54-57.990759.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3613674496644295,\n\
\ \"em_stderr\": 0.004919712134554973,\n \"f1\": 0.4096088506711411,\n\
\ \"f1_stderr\": 0.00477602953566436,\n \"acc\": 0.4563034467407025,\n\
\ \"acc_stderr\": 0.01086566601540176\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3613674496644295,\n \"em_stderr\": 0.004919712134554973,\n\
\ \"f1\": 0.4096088506711411,\n \"f1_stderr\": 0.00477602953566436\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14859742228961334,\n \
\ \"acc_stderr\": 0.009797503180527892\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T05_54_57.990759
path:
- '**/details_harness|drop|3_2023-10-23T05-54-57.990759.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T05-54-57.990759.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T05_54_57.990759
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-54-57.990759.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-54-57.990759.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-29-36.965037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-29-36.965037.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T05_54_57.990759
path:
- '**/details_harness|winogrande|5_2023-10-23T05-54-57.990759.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T05-54-57.990759.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_29_36.965037
path:
- results_2023-10-09T12-29-36.965037.parquet
- split: 2023_10_23T05_54_57.990759
path:
- results_2023-10-23T05-54-57.990759.parquet
- split: latest
path:
- results_2023-10-23T05-54-57.990759.parquet
---
# Dataset Card for Evaluation run of PulsarAI/Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/Nebula-7B](https://huggingface.co/PulsarAI/Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__Nebula-7B",
"harness_winogrande_5",
split="train")
```
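The per-task configurations listed in the YAML above can be loaded the same way. One detail worth noting: split names encode the run timestamp with underscores (e.g. `2023_10_09T12_29_36.965037`), while the parquet filenames use dashes (`2023-10-09T12-29-36.965037`). A small helper (an illustrative sketch, not part of any official tooling) can convert between the two:

```python
def split_to_file_timestamp(split_name: str) -> str:
    """Convert a split name such as '2023_10_09T12_29_36.965037'
    to the timestamp format used in the parquet filenames,
    e.g. '2023-10-09T12-29-36.965037'."""
    date_part, time_part = split_name.split("T")
    return date_part.replace("_", "-") + "T" + time_part.replace("_", "-")

print(split_to_file_timestamp("2023_10_09T12_29_36.965037"))
# → 2023-10-09T12-29-36.965037
```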
## Latest results
These are the [latest results from run 2023-10-23T05:54:57.990759](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__Nebula-7B/blob/main/results_2023-10-23T05-54-57.990759.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results files and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.3613674496644295,
"em_stderr": 0.004919712134554973,
"f1": 0.4096088506711411,
"f1_stderr": 0.00477602953566436,
"acc": 0.4563034467407025,
"acc_stderr": 0.01086566601540176
},
"harness|drop|3": {
"em": 0.3613674496644295,
"em_stderr": 0.004919712134554973,
"f1": 0.4096088506711411,
"f1_stderr": 0.00477602953566436
},
"harness|gsm8k|5": {
"acc": 0.14859742228961334,
"acc_stderr": 0.009797503180527892
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275625
}
}
```
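The aggregated `"all"` block above can be reproduced directly from the per-task entries. The snippet below (an illustrative sketch, with the numbers copied verbatim from the JSON above) averages the per-task accuracies and recovers the `"all"` accuracy:

```python
# Per-task metrics copied verbatim from the results JSON above.
results = {
    "harness|drop|3": {"em": 0.3613674496644295, "f1": 0.4096088506711411},
    "harness|gsm8k|5": {"acc": 0.14859742228961334},
    "harness|winogrande|5": {"acc": 0.7640094711917916},
}

# The "all" block's acc is the mean of the per-task accuracies.
accs = [m["acc"] for m in results.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.4563, matching "all.acc" above
```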
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
dmrau/cqudupstack-android (author: dmrau, tags: ["region:us"], likes: 0, downloads: 0, created: 2023-10-09T12:36:30, last modified: 2023-10-09T12:36:34.000Z)
---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 47953
num_examples: 699
- name: corpus
num_bytes: 12840959
num_examples: 22998
download_size: 7657118
dataset_size: 12888912
---
# Dataset Card for "cqudupstack-android"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dmrau/cqudupstack-gaming (author: dmrau, tags: ["region:us"], likes: 0, downloads: 0, created: 2023-10-09T12:36:43, last modified: 2023-10-09T12:36:47.000Z)
---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 105494
num_examples: 1595
- name: corpus
num_bytes: 20666596
num_examples: 45301
download_size: 12946080
dataset_size: 20772090
---
# Dataset Card for "cqudupstack-gaming"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dmrau/cqudupstack-mathematica (author: dmrau, tags: ["region:us"], likes: 0, downloads: 0, created: 2023-10-09T12:36:53, last modified: 2023-10-09T12:36:57.000Z)
---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 52792
num_examples: 804
- name: corpus
num_bytes: 18735825
num_examples: 16705
download_size: 10393860
dataset_size: 18788617
---
# Dataset Card for "cqudupstack-mathematica"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dmrau/cqudupstack-programmers (author: dmrau, tags: ["region:us"], likes: 0, downloads: 0, created: 2023-10-09T12:37:03, last modified: 2023-10-09T12:37:07.000Z)
---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 63785
num_examples: 876
- name: corpus
num_bytes: 32727262
num_examples: 32176
download_size: 19360000
dataset_size: 32791047
---
# Dataset Card for "cqudupstack-programmers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 629 | [
[
-0.036712646484375,
-0.01006317138671875,
0.0127105712890625,
0.02923583984375,
-0.0028209686279296875,
0.036163330078125,
0.0195465087890625,
-0.01538848876953125,
0.058990478515625,
0.037628173828125,
-0.056640625,
-0.0477294921875,
-0.033111572265625,
-0.... |
open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B | 2023-10-24T22:52:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T12:37:09 | ---
pretty_name: Evaluation run of Weyaxi/Samantha-Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Samantha-Nebula-7B](https://huggingface.co/Weyaxi/Samantha-Nebula-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T22:52:33.668661](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B/blob/main/results_2023-10-24T22-52-33.668661.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3792994966442953,\n\
\ \"em_stderr\": 0.004969032454438954,\n \"f1\": 0.4256501677852355,\n\
\ \"f1_stderr\": 0.0048455756354128885,\n \"acc\": 0.42229140848972546,\n\
\ \"acc_stderr\": 0.010604861041151385\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3792994966442953,\n \"em_stderr\": 0.004969032454438954,\n\
\ \"f1\": 0.4256501677852355,\n \"f1_stderr\": 0.0048455756354128885\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \
\ \"acc_stderr\": 0.008744810131034036\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268734\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Samantha-Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T22_52_33.668661
path:
- '**/details_harness|drop|3_2023-10-24T22-52-33.668661.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T22-52-33.668661.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T22_52_33.668661
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-52-33.668661.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-52-33.668661.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T22_52_33.668661
path:
- '**/details_harness|winogrande|5_2023-10-24T22-52-33.668661.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T22-52-33.668661.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- results_2023-10-09T12-36-46.129297.parquet
- split: 2023_10_24T22_52_33.668661
path:
- results_2023-10-24T22-52-33.668661.parquet
- split: latest
path:
- results_2023-10-24T22-52-33.668661.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Samantha-Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Samantha-Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Samantha-Nebula-7B](https://huggingface.co/Weyaxi/Samantha-Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
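As the configuration list in the YAML header shows, each timestamped split name is simply the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that convention (an observation about this card's layout, not an official API guarantee):

```python
# Derive the split name used in this card from a run timestamp.
# Observed convention: '-' and ':' in the timestamp become '_'.
timestamp = "2023-10-24T22:52:33.668661"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_10_24T22_52_33.668661
```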
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T22:52:33.668661](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B/blob/main/results_2023-10-24T22-52-33.668661.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3792994966442953,
"em_stderr": 0.004969032454438954,
"f1": 0.4256501677852355,
"f1_stderr": 0.0048455756354128885,
"acc": 0.42229140848972546,
"acc_stderr": 0.010604861041151385
},
"harness|drop|3": {
"em": 0.3792994966442953,
"em_stderr": 0.004969032454438954,
"f1": 0.4256501677852355,
"f1_stderr": 0.0048455756354128885
},
"harness|gsm8k|5": {
"acc": 0.11372251705837756,
"acc_stderr": 0.008744810131034036
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268734
}
}
```
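The results blob above is plain JSON, so individual metrics can be read with the standard library; a small sketch using the Winogrande numbers copied from this card:

```python
import json

# Parse a fragment of the results JSON shown above and read one metric.
results = json.loads("""
{
  "harness|winogrande|5": {
    "acc": 0.7308602999210734,
    "acc_stderr": 0.012464911951268734
  }
}
""")
acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande 5-shot acc: {acc:.4f}")  # winogrande 5-shot acc: 0.7309
```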
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,610 | [
[
-0.02410888671875,
-0.042083740234375,
0.020294189453125,
0.0125885009765625,
-0.0184326171875,
0.01084136962890625,
-0.0195770263671875,
-0.0131378173828125,
0.03338623046875,
0.04217529296875,
-0.051422119140625,
-0.0732421875,
-0.05523681640625,
0.0115280... |
dmrau/cqudupstack-tex | 2023-10-09T12:37:57.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:37:52 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 186934
num_examples: 2906
- name: corpus
num_bytes: 86600423
num_examples: 68184
download_size: 43424126
dataset_size: 86787357
---
# Dataset Card for "cqudupstack-tex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 623 | [
[
-0.036163330078125,
-0.017913818359375,
0.0225830078125,
0.018402099609375,
-0.01763916015625,
0.0328369140625,
0.0211181640625,
-0.0187225341796875,
0.059295654296875,
0.039581298828125,
-0.053192138671875,
-0.057281494140625,
-0.04010009765625,
-0.02308654... |
dmrau/cqudupstack-webmasters | 2023-10-09T12:38:04.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:38:01 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 34792
num_examples: 506
- name: corpus
num_bytes: 11659413
num_examples: 17405
download_size: 6885106
dataset_size: 11694205
---
# Dataset Card for "cqudupstack-webmasters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 627 | [
[
-0.042816162109375,
-0.006946563720703125,
-0.0030689239501953125,
0.026885986328125,
-0.01119232177734375,
0.0283966064453125,
0.016448974609375,
-0.019012451171875,
0.04949951171875,
0.04498291015625,
-0.0625,
-0.045745849609375,
-0.031951904296875,
-0.013... |
dmrau/cqudupstack-english | 2023-10-09T12:38:18.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:38:14 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 103588
num_examples: 1570
- name: corpus
num_bytes: 18199570
num_examples: 40221
download_size: 11382247
dataset_size: 18303158
---
# Dataset Card for "cqudupstack-english"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 627 | [
[
-0.024322509765625,
-0.0172119140625,
0.007659912109375,
0.032745361328125,
-0.026336669921875,
0.0259552001953125,
0.0014009475708007812,
-0.02667236328125,
0.070068359375,
0.03350830078125,
-0.051513671875,
-0.055999755859375,
-0.041107177734375,
-0.009613... |
dmrau/cqudupstack-gis | 2023-10-09T12:38:30.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:38:25 | ---
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: queries
num_bytes: 61244
num_examples: 885
- name: corpus
num_bytes: 36704924
num_examples: 37637
download_size: 20083359
dataset_size: 36766168
---
# Dataset Card for "cqudupstack-gis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 621 | [
[
-0.039398193359375,
-0.01290130615234375,
0.0138092041015625,
0.00838470458984375,
-0.0199432373046875,
0.029205322265625,
0.0225830078125,
-0.00726318359375,
0.06707763671875,
0.028350830078125,
-0.05267333984375,
-0.059661865234375,
-0.0450439453125,
-0.02... |
dmrau/cqudupstack-physics | 2023-10-09T12:38:36.000Z | [
"region:us"
] | dmrau | null | null | 0 | 0 | 2023-10-09T12:38:36 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
harinarayan/my_final_dataset | 2023-10-09T12:39:16.000Z | [
"region:us"
] | harinarayan | null | null | 0 | 0 | 2023-10-09T12:39:13 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 472226.0
num_examples: 33
download_size: 471990
dataset_size: 472226.0
---
# Dataset Card for "my_final_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 443 | [
[
-0.052734375,
-0.013702392578125,
0.0238189697265625,
0.005565643310546875,
-0.005382537841796875,
0.00536346435546875,
0.0124053955078125,
-0.001041412353515625,
0.06378173828125,
0.04962158203125,
-0.061737060546875,
-0.05224609375,
-0.034210205078125,
-0.... |
open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus | 2023-10-25T03:30:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T12:53:05 | ---
pretty_name: Evaluation run of akjindal53244/Mistral-7B-v0.1-Open-Platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [akjindal53244/Mistral-7B-v0.1-Open-Platypus](https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T03:30:37.870273](https://huggingface.co/datasets/open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus/blob/main/results_2023-10-25T03-30-37.870273.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.16128355704697986,\n\
\ \"em_stderr\": 0.0037665373341562473,\n \"f1\": 0.21934249161073788,\n\
\ \"f1_stderr\": 0.003766121643482467,\n \"acc\": 0.47474797642135197,\n\
\ \"acc_stderr\": 0.011060564905702893\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.16128355704697986,\n \"em_stderr\": 0.0037665373341562473,\n\
\ \"f1\": 0.21934249161073788,\n \"f1_stderr\": 0.003766121643482467\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1728582259287339,\n \
\ \"acc_stderr\": 0.010415432246200586\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205201\n\
\ }\n}\n```"
repo_url: https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T03_30_37.870273
path:
- '**/details_harness|drop|3_2023-10-25T03-30-37.870273.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T03-30-37.870273.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T03_30_37.870273
path:
- '**/details_harness|gsm8k|5_2023-10-25T03-30-37.870273.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T03-30-37.870273.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-52-41.880840.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-52-41.880840.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T03_30_37.870273
path:
- '**/details_harness|winogrande|5_2023-10-25T03-30-37.870273.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T03-30-37.870273.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_52_41.880840
path:
- results_2023-10-09T12-52-41.880840.parquet
- split: 2023_10_25T03_30_37.870273
path:
- results_2023-10-25T03-30-37.870273.parquet
- split: latest
path:
- results_2023-10-25T03-30-37.870273.parquet
---
# Dataset Card for Evaluation run of akjindal53244/Mistral-7B-v0.1-Open-Platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [akjindal53244/Mistral-7B-v0.1-Open-Platypus](https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus",
"harness_winogrande_5",
split="train")
```
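The configs in the YAML header above expose, for every task, one split per run timestamp plus a `latest` alias. The mapping from a run timestamp to those names is a plain string substitution; a minimal sketch of the convention as it appears in the configs (ordinary Python, no leaderboard tooling assumed):

```python
# Naming convention observed in the configs above: a run timestamp yields
# a split name ("-" and ":" become "_") and a parquet filename timestamp
# (":" becomes "-"); the microseconds dot is kept in both.

def split_name(run_timestamp: str) -> str:
    # '2023-10-25T03:30:37.870273' -> '2023_10_25T03_30_37.870273'
    return run_timestamp.replace("-", "_").replace(":", "_")

def parquet_timestamp(run_timestamp: str) -> str:
    # '2023-10-25T03:30:37.870273' -> '2023-10-25T03-30-37.870273'
    return run_timestamp.replace(":", "-")

run = "2023-10-25T03:30:37.870273"
print(split_name(run))
print(parquet_timestamp(run))
```

Passing such a timestamped split name to `load_dataset` pins a specific run, while the `latest` split defined in the configs follows the most recent one.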
## Latest results
These are the [latest results from run 2023-10-25T03:30:37.870273](https://huggingface.co/datasets/open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus/blob/main/results_2023-10-25T03-30-37.870273.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.16128355704697986,
"em_stderr": 0.0037665373341562473,
"f1": 0.21934249161073788,
"f1_stderr": 0.003766121643482467,
"acc": 0.47474797642135197,
"acc_stderr": 0.011060564905702893
},
"harness|drop|3": {
"em": 0.16128355704697986,
"em_stderr": 0.0037665373341562473,
"f1": 0.21934249161073788,
"f1_stderr": 0.003766121643482467
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.010415432246200586
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205201
}
}
```
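The top-level "all" entry appears to be an unweighted mean of the per-task metrics: `em`/`f1` come straight from the single drop run, while `acc` averages gsm8k and winogrande. A quick arithmetic check against the numbers above:

```python
# Check that the "all" accuracy is the plain mean of the per-task
# accuracies reported above (gsm8k and winogrande); likewise for stderr.
gsm8k_acc = 0.1728582259287339
winogrande_acc = 0.77663772691397

all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # matches the reported 0.47474797642135197 up to float rounding

all_acc_stderr = (0.010415432246200586 + 0.011705697565205201) / 2
print(all_acc_stderr)  # matches the reported 0.011060564905702893
```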
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B | 2023-10-28T17:59:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T13:04:20 | ---
pretty_name: Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T17:59:18.672226](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B/blob/main/results_2023-10-28T17-59-18.672226.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007655201342281879,\n\
\ \"em_stderr\": 0.0008925843316825968,\n \"f1\": 0.06762374161073832,\n\
\ \"f1_stderr\": 0.0015672145775403328,\n \"acc\": 0.48594340621826704,\n\
\ \"acc_stderr\": 0.011102174081480334\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007655201342281879,\n \"em_stderr\": 0.0008925843316825968,\n\
\ \"f1\": 0.06762374161073832,\n \"f1_stderr\": 0.0015672145775403328\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18498862774829417,\n \
\ \"acc_stderr\": 0.010695390472237908\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.01150895769072276\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|arc:challenge|25_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T17_59_18.672226
path:
- '**/details_harness|drop|3_2023-10-28T17-59-18.672226.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T17-59-18.672226.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T17_59_18.672226
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-59-18.672226.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-59-18.672226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hellaswag|10_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T17_59_18.672226
path:
- '**/details_harness|winogrande|5_2023-10-28T17-59-18.672226.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T17-59-18.672226.parquet'
- config_name: results
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- results_2023-10-09T13-03-57.822479.parquet
- split: 2023_10_28T17_59_18.672226
path:
- results_2023-10-28T17-59-18.672226.parquet
- split: latest
path:
- results_2023-10-28T17-59-18.672226.parquet
---
# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B",
"harness_winogrande_5",
split="train")
```
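Judging from the configs above, each dated split name appears to be just the run timestamp with `-` and `:` replaced by underscores. A minimal helper sketch, assuming that naming convention holds for all runs (`run_split_name` is a hypothetical name, not part of any library):

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp such as '2023-10-28T17:59:18.672226'
    to the split name used in this dataset's configs,
    e.g. '2023_10_28T17_59_18.672226'."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2023-10-28T17:59:18.672226"))
# 2023_10_28T17_59_18.672226
```

This can be handy for selecting a specific run's split instead of `latest`.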
## Latest results
These are the [latest results from run 2023-10-28T17:59:18.672226](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B/blob/main/results_2023-10-28T17-59-18.672226.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007655201342281879,
"em_stderr": 0.0008925843316825968,
"f1": 0.06762374161073832,
"f1_stderr": 0.0015672145775403328,
"acc": 0.48594340621826704,
"acc_stderr": 0.011102174081480334
},
"harness|drop|3": {
"em": 0.007655201342281879,
"em_stderr": 0.0008925843316825968,
"f1": 0.06762374161073832,
"f1_stderr": 0.0015672145775403328
},
"harness|gsm8k|5": {
"acc": 0.18498862774829417,
"acc_stderr": 0.010695390472237908
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.01150895769072276
}
}
```
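As a quick sanity check on these figures, a rough 95% confidence interval can be derived from each accuracy and its reported standard error with the usual normal approximation (the winogrande numbers below are copied from the results above; this is only an illustrative back-of-the-envelope calculation, not part of the evaluation harness):

```python
# Winogrande 5-shot figures from the latest run above
acc = 0.7868981846882399
acc_stderr = 0.01150895769072276

# Normal-approximation 95% confidence interval: acc +/- 1.96 * stderr
low, high = acc - 1.96 * acc_stderr, acc + 1.96 * acc_stderr
print(f"winogrande acc = {acc:.3f} (95% CI [{low:.3f}, {high:.3f}])")
# winogrande acc = 0.787 (95% CI [0.764, 0.809])
```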
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,864 | [
[
-0.0295562744140625,
-0.05474853515625,
0.00629425048828125,
0.026947021484375,
-0.0080718994140625,
0.0114593505859375,
-0.0297393798828125,
-0.01548004150390625,
0.0283050537109375,
0.03509521484375,
-0.04486083984375,
-0.06390380859375,
-0.053680419921875,
... |
nitinbhayana/review-phrases-sentiments-v2 | 2023-10-09T13:09:06.000Z | [
"region:us"
] | nitinbhayana | null | null | 0 | 0 | 2023-10-09T13:08:13 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01497650146484375,
0.05718994140625,
0.02880859375,
-0.0350341796875,
0.046478271484375,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.0170135498046875,
-0.052093505859375,
-0.01497650146484375,
-0.0604248046875,
0.0379028... |
Rootreck/so-vits-svc-4.0-Fallout_4 | 2023-10-17T13:27:29.000Z | [
"region:us"
] | Rootreck | null | null | 0 | 0 | 2023-10-09T13:08:23 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
nitinbhayana/keyword-category-brand-v1 | 2023-10-09T13:11:05.000Z | [
"region:us"
] | nitinbhayana | null | null | 0 | 0 | 2023-10-09T13:10:28 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
harinarayan/my_newest_dataset | 2023-10-09T13:22:45.000Z | [
"region:us"
] | harinarayan | null | null | 0 | 0 | 2023-10-09T13:22:43 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1558328.0
num_examples: 36
download_size: 1436147
dataset_size: 1558328.0
---
# Dataset Card for "my_newest_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 447 | [
[
-0.057220458984375,
-0.0237579345703125,
0.00496673583984375,
0.0078277587890625,
-0.004669189453125,
0.00421142578125,
0.022613525390625,
-0.01010894775390625,
0.06463623046875,
0.03839111328125,
-0.057464599609375,
-0.058074951171875,
-0.0386962890625,
-0.... |
LoliUsa/nva-xxx2 | 2023-10-09T13:23:42.000Z | [
"region:us"
] | LoliUsa | null | null | 0 | 0 | 2023-10-09T13:23:15 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
ninjawick/cagliostro-colab-ui | 2023-10-09T13:25:44.000Z | [
"region:us"
] | ninjawick | null | null | 0 | 0 | 2023-10-09T13:25:44 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
harinarayan/my_small_dataset | 2023-10-09T13:50:48.000Z | [
"region:us"
] | harinarayan | null | null | 0 | 0 | 2023-10-09T13:50:47 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 445121.0
num_examples: 8
download_size: 417058
dataset_size: 445121.0
---
# Dataset Card for "my_small_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 442 | [
[
-0.05450439453125,
-0.0173797607421875,
0.02001953125,
0.0059356689453125,
-0.00958251953125,
-0.01515960693359375,
0.00762939453125,
-0.0008006095886230469,
0.07647705078125,
0.0280609130859375,
-0.0567626953125,
-0.037078857421875,
-0.034637451171875,
-0.0... |
sooyeon/autotrain-data-flan-t5-large-financial-phrasebank-lora | 2023-10-09T13:51:35.000Z | [
"region:us"
] | sooyeon | null | null | 0 | 0 | 2023-10-09T13:51:35 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
jamsonE/myself | 2023-10-09T13:52:53.000Z | [
"region:us"
] | jamsonE | null | null | 0 | 0 | 2023-10-09T13:52:26 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
harinarayan/my_tiny_dataset | 2023-10-09T13:59:03.000Z | [
"region:us"
] | harinarayan | null | null | 0 | 0 | 2023-10-09T13:54:28 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 445121.0
num_examples: 8
download_size: 0
dataset_size: 445121.0
---
# Dataset Card for "my_tiny_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 436 | [
[
-0.050506591796875,
-0.0189056396484375,
0.020721435546875,
0.0065765380859375,
-0.00934600830078125,
-0.0089111328125,
0.00905609130859375,
0.0019931793212890625,
0.07489013671875,
0.022369384765625,
-0.0574951171875,
-0.03521728515625,
-0.032318115234375,
... |
jamsonE/doc | 2023-10-09T13:54:50.000Z | [
"region:us"
] | jamsonE | null | null | 0 | 0 | 2023-10-09T13:54:50 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
YSFF/my_loras | 2023-10-09T14:46:28.000Z | [
"region:us"
] | YSFF | null | null | 0 | 0 | 2023-10-09T14:41:18 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
W1lson/testt2 | 2023-10-09T14:58:09.000Z | [
"region:us"
] | W1lson | null | null | 0 | 0 | 2023-10-09T14:56:03 | ---
dataset_info:
features:
- name: Category
dtype: string
- name: Description
dtype: string
splits:
- name: train
num_bytes: 383
num_examples: 5
download_size: 1879
dataset_size: 383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "testt2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 464 | [
[
-0.027984619140625,
-0.01401519775390625,
0.0100860595703125,
0.01029205322265625,
-0.01354217529296875,
0.0005345344543457031,
0.0214996337890625,
-0.01194000244140625,
0.03070068359375,
0.01403045654296875,
-0.048553466796875,
-0.034515380859375,
-0.0404968261... |
RorooroR/BossaNova | 2023-10-09T17:13:19.000Z | [
"region:us"
] | RorooroR | null | null | 0 | 0 | 2023-10-09T15:05:16 | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 1147109518.125
num_examples: 27791
download_size: 1143310714
dataset_size: 1147109518.125
---
# Dataset Card for "BossaNova"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 441 | [
[
-0.037750244140625,
-0.01312255859375,
0.01338958740234375,
0.020050048828125,
-0.03094482421875,
-0.01099395751953125,
0.007106781005859375,
-0.0221405029296875,
0.06634521484375,
0.0252532958984375,
-0.064208984375,
-0.060272216796875,
-0.032257080078125,
... |
Globaly/familias5k | 2023-10-09T15:17:31.000Z | [
"region:us"
] | Globaly | null | null | 0 | 0 | 2023-10-09T15:16:34 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
selinerdem/test-german-orca | 2023-10-09T15:18:27.000Z | [
"region:us"
] | selinerdem | null | null | 0 | 0 | 2023-10-09T15:17:26 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 1,732 | [
[
-0.038177490234375,
-0.02984619140625,
-0.0036067962646484375,
0.027130126953125,
-0.0323486328125,
0.0037822723388671875,
-0.01727294921875,
-0.02020263671875,
0.049041748046875,
0.04046630859375,
-0.0634765625,
-0.08062744140625,
-0.052947998046875,
0.0020... |
jpiorko/genai_survey_dataset | 2023-10-09T15:18:57.000Z | [
"region:us"
] | jpiorko | null | null | 0 | 0 | 2023-10-09T15:18:57 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
open-llm-leaderboard/details_budecosystem__boomer-1b | 2023-10-24T14:53:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T15:38:00 | ---
pretty_name: Evaluation run of budecosystem/boomer-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [budecosystem/boomer-1b](https://huggingface.co/budecosystem/boomer-1b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__boomer-1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T14:53:25.007106](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__boomer-1b/blob/main/results_2023-10-24T14-53-25.007106.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335763,\n \"f1\": 0.052141359060402785,\n\
\ \"f1_stderr\": 0.0013172260484977333,\n \"acc\": 0.2571140151259026,\n\
\ \"acc_stderr\": 0.008333536236283095\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335763,\n\
\ \"f1\": 0.052141359060402785,\n \"f1_stderr\": 0.0013172260484977333\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.002615326510775672\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790516\n\
\ }\n}\n```"
repo_url: https://huggingface.co/budecosystem/boomer-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T14_53_25.007106
path:
- '**/details_harness|drop|3_2023-10-24T14-53-25.007106.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T14-53-25.007106.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T14_53_25.007106
path:
- '**/details_harness|gsm8k|5_2023-10-24T14-53-25.007106.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T14-53-25.007106.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-37-37.200624.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-37-37.200624.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T14_53_25.007106
path:
- '**/details_harness|winogrande|5_2023-10-24T14-53-25.007106.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T14-53-25.007106.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_37_37.200624
path:
- results_2023-10-09T15-37-37.200624.parquet
- split: 2023_10_24T14_53_25.007106
path:
- results_2023-10-24T14-53-25.007106.parquet
- split: latest
path:
- results_2023-10-24T14-53-25.007106.parquet
---
# Dataset Card for Evaluation run of budecosystem/boomer-1b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/budecosystem/boomer-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [budecosystem/boomer-1b](https://huggingface.co/budecosystem/boomer-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_budecosystem__boomer-1b",
"harness_winogrande_5",
split="train")
```
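Beyond `train` and `latest`, each run is also exposed as a split named after its timestamp (e.g. `2023_10_24T14_53_25.007106`). Judging from the `configs` section above, the split name appears to be the run timestamp with the separators that are not allowed in split names swapped for underscores; a minimal sketch of that mapping (the helper name is mine, not part of the dataset):

```python
def run_timestamp_to_split(ts: str) -> str:
    # '2023-10-24T14:53:25.007106' -> '2023_10_24T14_53_25.007106':
    # dashes in the date and colons in the time both become underscores,
    # while the 'T' separator and the fractional seconds are kept as-is.
    date_part, time_part = ts.split("T")
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

print(run_timestamp_to_split("2023-10-24T14:53:25.007106"))
# -> 2023_10_24T14_53_25.007106
```

Passing that name as `split=` to `load_dataset` selects the corresponding run instead of the latest one.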
## Latest results
These are the [latest results from run 2023-10-24T14:53:25.007106](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__boomer-1b/blob/main/results_2023-10-24T14-53-25.007106.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335763,
"f1": 0.052141359060402785,
"f1_stderr": 0.0013172260484977333,
"acc": 0.2571140151259026,
"acc_stderr": 0.008333536236283095
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335763,
"f1": 0.052141359060402785,
"f1_stderr": 0.0013172260484977333
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775672
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.014051745961790516
}
}
```
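The `"all"` block above appears to simply average each metric over the tasks that report it; a quick sanity check for `acc`, with the per-task values copied from the JSON above:

```python
# Per-task accuracies copied from the results JSON above.
per_task_acc = {
    "harness|gsm8k|5": 0.009097801364670205,
    "harness|winogrande|5": 0.505130228887135,
}

# The mean of the task accuracies reproduces the "all"/"acc" value.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 6))  # ≈ 0.257114, matching "all"/"acc"
```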
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,596 | [
[
-0.03094482421875,
-0.0548095703125,
0.00905609130859375,
0.018035888671875,
-0.0167999267578125,
0.00926971435546875,
-0.0266876220703125,
-0.01444244384765625,
0.0218353271484375,
0.033203125,
-0.05224609375,
-0.06988525390625,
-0.048004150390625,
0.012283... |
open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16 | 2023-10-26T08:32:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T15:53:34 | ---
pretty_name: Evaluation run of caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16](https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T08:32:33.327127](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16/blob/main/results_2023-10-26T08-32-33.327127.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0045092281879194635,\n\
\ \"em_stderr\": 0.0006861346899094969,\n \"f1\": 0.08383808724832231,\n\
\ \"f1_stderr\": 0.0017696414807013908,\n \"acc\": 0.46277883857625746,\n\
\ \"acc_stderr\": 0.011001753966995261\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0045092281879194635,\n \"em_stderr\": 0.0006861346899094969,\n\
\ \"f1\": 0.08383808724832231,\n \"f1_stderr\": 0.0017696414807013908\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1599696739954511,\n \
\ \"acc_stderr\": 0.010097377827752538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\
\ }\n}\n```"
repo_url: https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T08_32_33.327127
path:
- '**/details_harness|drop|3_2023-10-26T08-32-33.327127.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T08-32-33.327127.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T08_32_33.327127
path:
- '**/details_harness|gsm8k|5_2023-10-26T08-32-33.327127.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T08-32-33.327127.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-53-10.944584.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-53-10.944584.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T08_32_33.327127
path:
- '**/details_harness|winogrande|5_2023-10-26T08-32-33.327127.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T08-32-33.327127.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_53_10.944584
path:
- results_2023-10-09T15-53-10.944584.parquet
- split: 2023_10_26T08_32_33.327127
path:
- results_2023-10-26T08-32-33.327127.parquet
- split: latest
path:
- results_2023-10-26T08-32-33.327127.parquet
---
# Dataset Card for Evaluation run of caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16](https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16",
"harness_winogrande_5",
split="train")
```
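The split names inside each configuration are derived from the run timestamp shown in the results: dashes and colons in the ISO timestamp become underscores (for example, run `2023-10-26T08:32:33.327127` corresponds to split `2023_10_26T08_32_33.327127`). A small helper (hypothetical, not part of the `datasets` API) can build that split name:

```python
def run_timestamp_to_split(run_timestamp: str) -> str:
    """Map a run timestamp (as reported in the results) to its split name.

    Example: "2023-10-26T08:32:33.327127" -> "2023_10_26T08_32_33.327127"
    """
    # Dashes and colons both become underscores; the "T" separator and the
    # fractional-seconds dot are kept as-is.
    return run_timestamp.replace("-", "_").replace(":", "_")
```

The returned string can then be passed as the `split` argument of `load_dataset` to pin a specific run instead of the moving `latest` split.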
## Latest results
These are the [latest results from run 2023-10-26T08:32:33.327127](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16/blob/main/results_2023-10-26T08-32-33.327127.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0045092281879194635,
"em_stderr": 0.0006861346899094969,
"f1": 0.08383808724832231,
"f1_stderr": 0.0017696414807013908,
"acc": 0.46277883857625746,
"acc_stderr": 0.011001753966995261
},
"harness|drop|3": {
"em": 0.0045092281879194635,
"em_stderr": 0.0006861346899094969,
"f1": 0.08383808724832231,
"f1_stderr": 0.0017696414807013908
},
"harness|gsm8k|5": {
"acc": 0.1599696739954511,
"acc_stderr": 0.010097377827752538
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
}
}
```
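In the metrics above, the top-level `"all"` accuracy is consistent with the unweighted mean of the per-task accuracies (gsm8k and winogrande; drop reports em/f1 rather than acc). A quick sanity check, assuming this simple averaging:

```python
# Per-task accuracies copied from the results above.
task_accs = {
    "harness|gsm8k|5": 0.1599696739954511,
    "harness|winogrande|5": 0.7655880031570639,
}

# Unweighted mean over the tasks that report "acc".
mean_acc = sum(task_accs.values()) / len(task_accs)

# Matches the reported "all" accuracy up to floating-point noise.
assert abs(mean_acc - 0.46277883857625746) < 1e-12
```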
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,856 | [
---
pretty_name: Evaluation run of nicholasKluge/Aira-1B5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-1B5](https://huggingface.co/nicholasKluge/Aira-1B5) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-1B5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-09T15:54:46.926141](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-1B5/blob/main/results_2023-10-09T15-54-46.926141.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2743564373642291,\n\
\ \"acc_stderr\": 0.03211959266297477,\n \"acc_norm\": 0.27587655832273894,\n\
\ \"acc_norm_stderr\": 0.0321270755179637,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4115839931034755,\n\
\ \"mc2_stderr\": 0.015541548311642976\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2687713310580205,\n \"acc_stderr\": 0.01295506596371069,\n\
\ \"acc_norm\": 0.28924914675767915,\n \"acc_norm_stderr\": 0.013250012579393443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36188010356502687,\n\
\ \"acc_stderr\": 0.004795622757327151,\n \"acc_norm\": 0.43108942441744674,\n\
\ \"acc_norm_stderr\": 0.004942164585991465\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.037150621549989056,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.037150621549989056\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708604,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708604\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n\
\ \"acc_stderr\": 0.0243625996930311,\n \"acc_norm\": 0.24193548387096775,\n\
\ \"acc_norm_stderr\": 0.0243625996930311\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121633,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121633\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3394495412844037,\n \"acc_stderr\": 0.02030210934266235,\n \"\
acc_norm\": 0.3394495412844037,\n \"acc_norm_stderr\": 0.02030210934266235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.21518987341772153,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13901345291479822,\n\
\ \"acc_stderr\": 0.023219352834474464,\n \"acc_norm\": 0.13901345291479822,\n\
\ \"acc_norm_stderr\": 0.023219352834474464\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952688,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952688\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21367521367521367,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.21367521367521367,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20561941251596424,\n\
\ \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n\
\ \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23468057366362452,\n\
\ \"acc_stderr\": 0.010824026872449358,\n \"acc_norm\": 0.23468057366362452,\n\
\ \"acc_norm_stderr\": 0.010824026872449358\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877757,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877757\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23202614379084968,\n \"acc_stderr\": 0.01707737337785701,\n \
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.01707737337785701\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4115839931034755,\n\
\ \"mc2_stderr\": 0.015541548311642976\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-1B5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-54-46.926141.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-54-46.926141.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-54-46.926141.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_54_46.926141
path:
- results_2023-10-09T15-54-46.926141.parquet
- split: latest
path:
- results_2023-10-09T15-54-46.926141.parquet
---
# Dataset Card for Evaluation run of openbmb/UltraLM-13b-v2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openbmb/UltraLM-13b-v2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openbmb/UltraLM-13b-v2.0](https://huggingface.co/openbmb/UltraLM-13b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0",
"harness_truthfulqa_mc_0",
split="train")
```
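Each configuration's data files listed in the YAML header follow a fixed naming pattern: `**/details_harness|<task>|<n_shot>_<timestamp>.parquet`. A minimal helper (illustrative only, not part of the leaderboard tooling) that reconstructs this glob for a given task and run:

```python
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    """Build the glob pattern for a task's per-sample detail files.

    `timestamp` uses the run's file-name format, e.g.
    "2023-10-09T15-54-46.926141" (colons replaced by dashes).
    """
    return f"**/details_harness|{task}|{n_shot}_{timestamp}.parquet"


# Matches the entry for the anatomy task in the YAML header above.
print(details_glob("hendrycksTest-anatomy", 5, "2023-10-09T15-54-46.926141"))
```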
## Latest results
These are the [latest results from run 2023-10-09T15:54:46.926141](https://huggingface.co/datasets/open-llm-leaderboard/details_openbmb__UltraLM-13b-v2.0/blob/main/results_2023-10-09T15-54-46.926141.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2743564373642291,
"acc_stderr": 0.03211959266297477,
"acc_norm": 0.27587655832273894,
"acc_norm_stderr": 0.0321270755179637,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4115839931034755,
"mc2_stderr": 0.015541548311642976
},
"harness|arc:challenge|25": {
"acc": 0.2687713310580205,
"acc_stderr": 0.01295506596371069,
"acc_norm": 0.28924914675767915,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.36188010356502687,
"acc_stderr": 0.004795622757327151,
"acc_norm": 0.43108942441744674,
"acc_norm_stderr": 0.004942164585991465
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708604,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708604
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.0243625996930311,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.0243625996930311
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121633,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121633
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3394495412844037,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.3394495412844037,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21518987341772153,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.21518987341772153,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13901345291479822,
"acc_stderr": 0.023219352834474464,
"acc_norm": 0.13901345291479822,
"acc_norm_stderr": 0.023219352834474464
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952688,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952688
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21367521367521367,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.21367521367521367,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537762,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537762
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23468057366362452,
"acc_stderr": 0.010824026872449358,
"acc_norm": 0.23468057366362452,
"acc_norm_stderr": 0.010824026872449358
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877757,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877757
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.01707737337785701,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.01707737337785701
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4115839931034755,
"mc2_stderr": 0.015541548311642976
}
}
```
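Each entry above reports an accuracy (`acc`) together with its standard error (`acc_stderr`). As a minimal sketch (assuming the usual normal approximation, which the leaderboard itself does not specify here), such a pair can be turned into an approximate 95% confidence interval:

```python
# Sketch: convert an (acc, acc_stderr) pair from the results above into an
# approximate 95% confidence interval (normal approximation assumed).
def ci95(acc: float, stderr: float) -> tuple[float, float]:
    return (acc - 1.96 * stderr, acc + 1.96 * stderr)

# Example with the hendrycksTest-management entry:
lo, hi = ci95(0.3786407766990291, 0.04802694698258972)
```

This makes it easier to judge whether two task scores in the table actually differ beyond noise.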
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
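Although the template section above is unfilled, the repository's configuration lists follow a clear convention: each configuration has one split per evaluation timestamp (e.g. `2023_10_09T15_57_53.203212`) plus a `latest` alias pointing to the most recent run. A minimal sketch, assuming that naming scheme, for resolving the newest timestamped split without relying on the alias:

```python
# Sketch: given a config's split names (one per run timestamp, plus the
# "latest" alias), resolve the most recent timestamped split.
# The timestamp format "YYYY_MM_DDTHH_MM_SS.ffffff" is fixed-width, so
# lexicographic comparison matches chronological order.
def newest_split(splits: list[str]) -> str:
    timestamped = [s for s in splits if s != "latest"]
    return max(timestamped)

splits = ["2023_10_09T15_57_53.203212", "2023_10_25T04_08_20.324415", "latest"]
```

In practice `split="latest"` with `load_dataset` is the simpler route; this helper is only useful when comparing runs programmatically.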
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,999 | [
open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2 | 2023-10-25T04:08:32.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T15:58:17 |
---
pretty_name: Evaluation run of caisarl76/Mistral-7B-guanaco1k-ep2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [caisarl76/Mistral-7B-guanaco1k-ep2](https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T04:08:20.324415](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2/blob/main/results_2023-10-25T04-08-20.324415.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094507,\n \"f1\": 0.06542994966442944,\n\
\ \"f1_stderr\": 0.001488633695023099,\n \"acc\": 0.4501858873976542,\n\
\ \"acc_stderr\": 0.010287740882080417\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094507,\n\
\ \"f1\": 0.06542994966442944,\n \"f1_stderr\": 0.001488633695023099\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \
\ \"acc_stderr\": 0.008944213403553055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ }\n}\n```"
repo_url: https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T04_08_20.324415
path:
- '**/details_harness|drop|3_2023-10-25T04-08-20.324415.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T04-08-20.324415.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T04_08_20.324415
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-08-20.324415.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-08-20.324415.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-57-53.203212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T15-57-53.203212.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T04_08_20.324415
path:
- '**/details_harness|winogrande|5_2023-10-25T04-08-20.324415.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T04-08-20.324415.parquet'
- config_name: results
data_files:
- split: 2023_10_09T15_57_53.203212
path:
- results_2023-10-09T15-57-53.203212.parquet
- split: 2023_10_25T04_08_20.324415
path:
- results_2023-10-25T04-08-20.324415.parquet
- split: latest
path:
- results_2023-10-25T04-08-20.324415.parquet
---
# Dataset Card for Evaluation run of caisarl76/Mistral-7B-guanaco1k-ep2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [caisarl76/Mistral-7B-guanaco1k-ep2](https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2",
"harness_winogrande_5",
split="train")
```
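Split names other than `latest` encode the run timestamp (e.g. `2023_10_25T04_08_20.324415`), so the `latest` alias can also be resolved by hand; a minimal sketch, assuming the naming pattern shown in the configs above (the `resolve_latest` helper is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def resolve_latest(split_names):
    """Return the most recent run split among timestamp-style names."""
    runs = [s for s in split_names if s != "latest"]
    # Split names follow the pattern YYYY_MM_DDTHH_MM_SS.microseconds
    return max(runs, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2023_10_09T15_57_53.203212", "2023_10_25T04_08_20.324415", "latest"]
print(resolve_latest(splits))  # 2023_10_25T04_08_20.324415
```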
## Latest results
These are the [latest results from run 2023-10-25T04:08:20.324415](https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2/blob/main/results_2023-10-25T04-08-20.324415.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094507,
"f1": 0.06542994966442944,
"f1_stderr": 0.001488633695023099,
"acc": 0.4501858873976542,
"acc_stderr": 0.010287740882080417
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094507,
"f1": 0.06542994966442944,
"f1_stderr": 0.001488633695023099
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.008944213403553055
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
}
}
```
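For the run shown here, the top-level "all" block appears to be the unweighted mean of the per-task metrics; this can be checked directly with the values copied from above (a sketch, not leaderboard code):

```python
# Per-task accuracies copied from the results above
results = {
    "harness|gsm8k|5": {"acc": 0.1197877179681577},
    "harness|winogrande|5": {"acc": 0.7805840568271507},
}

# "all" accuracy is the unweighted mean over the tasks reporting "acc"
mean_acc = sum(t["acc"] for t in results.values()) / len(results)
print(mean_acc)  # matches the "all" accuracy above, ~0.4501858873976542
```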
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,724 | [
[
-0.027252197265625,
-0.04534912109375,
0.01215362548828125,
0.0196380615234375,
-0.01380157470703125,
0.003414154052734375,
-0.02288818359375,
-0.0101776123046875,
0.025787353515625,
0.039154052734375,
-0.0494384765625,
-0.07110595703125,
-0.0501708984375,
0... |
Autoceres/Agricorp | 2023-10-09T16:31:40.000Z | [
"region:us"
] | Autoceres | null | null | 0 | 0 | 2023-10-09T16:02:14 | Agricorp Dataset
The AutoCeres dataset comprises a collection of images captured from various sources and cultivation locations. It encompasses the following crops:
Corn
Soybean
Rice
Onion
Each crop category is associated with a set of images, and for further analysis and segmentation tasks, masks corresponding to these crops are also included. This dataset serves as a valuable resource for the development and training of computer vision algorithms in the agricultural domain. | 483 | [
[
-0.034393310546875,
-0.0035800933837890625,
0.003429412841796875,
-0.0019512176513671875,
0.023284912109375,
0.0212554931640625,
0.020416259765625,
-0.045013427734375,
0.017852783203125,
0.048553466796875,
-0.0251007080078125,
-0.054168701171875,
-0.0556640625,
... |
s-lab/images | 2023-10-09T16:39:46.000Z | [
"region:us"
] | s-lab | null | null | 0 | 0 | 2023-10-09T16:06:31 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
SaiGirish/ddpm-butterflies-128 | 2023-10-09T16:06:39.000Z | [
"region:us"
] | SaiGirish | null | null | 0 | 0 | 2023-10-09T16:06:39 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
JennyZZZ/guanaco-llama2-1k | 2023-10-11T16:42:14.000Z | [
"region:us"
] | JennyZZZ | null | null | 0 | 0 | 2023-10-09T16:22:36 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15401731
num_examples: 9846
- name: test
num_bytes: 815439
num_examples: 518
download_size: 0
dataset_size: 16217170
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 538 | [
[
-0.02203369140625,
-0.01282501220703125,
0.017364501953125,
0.037689208984375,
-0.03839111328125,
0.0008511543273925781,
0.0258941650390625,
-0.0190887451171875,
0.06463623046875,
0.029876708984375,
-0.05474853515625,
-0.06707763671875,
-0.050262451171875,
-... |
ostapeno/ds | 2023-10-09T16:54:04.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-09T16:53:06 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: subject
dtype: string
- name: response
dtype: string
- name: author_instr
dtype: string
- name: inst_index_for_context
dtype: 'null'
- name: author_response
dtype: string
- name: normalized_cumul_logprob_response
dtype: float64
splits:
- name: train
num_bytes: 324551830
num_examples: 78955
download_size: 94548965
dataset_size: 324551830
---
# Dataset Card for "ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 665 | [
[
-0.049713134765625,
-0.0191650390625,
0.0311126708984375,
0.0031261444091796875,
-0.0170440673828125,
0.004352569580078125,
0.0260467529296875,
-0.009307861328125,
0.07489013671875,
0.0380859375,
-0.07366943359375,
-0.05328369140625,
-0.054595947265625,
-0.0... |
bofuchen/meimei | 2023-10-09T16:53:19.000Z | [
"region:us"
] | bofuchen | null | null | 0 | 0 | 2023-10-09T16:53:19 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
SohamNale/Banking_Dataset_for_LLM_Finetuning | 2023-10-09T17:05:59.000Z | [
"region:us"
] | SohamNale | null | null | 0 | 0 | 2023-10-09T17:05:04 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01494598388671875,
0.057159423828125,
0.02880859375,
-0.0350341796875,
0.046478271484375,
0.052520751953125,
0.005077362060546875,
0.051361083984375,
0.0170135498046875,
-0.05206298828125,
-0.01494598388671875,
-0.06036376953125,
0.03... |
sleepyboyeyes/Acoustic | 2023-10-17T18:23:04.000Z | [
"region:us"
] | sleepyboyeyes | null | null | 0 | 0 | 2023-10-09T17:19:54 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01494598388671875,
0.057159423828125,
0.02880859375,
-0.0350341796875,
0.046478271484375,
0.052520751953125,
0.005077362060546875,
0.051361083984375,
0.0170135498046875,
-0.05206298828125,
-0.01494598388671875,
-0.06036376953125,
0.03... |
autoevaluate/autoeval-eval-acronym_identification-default-e52b53-94025145972 | 2023-10-09T17:39:37.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-09T17:39:32 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01494598388671875,
0.057159423828125,
0.02880859375,
-0.0350341796875,
0.046478271484375,
0.052520751953125,
0.005077362060546875,
0.051361083984375,
0.0170135498046875,
-0.05206298828125,
-0.01494598388671875,
-0.06036376953125,
0.03... |
peter-h-o-r-v/autocast-initiative | 2023-10-09T18:31:55.000Z | [
"license:artistic-2.0",
"art",
"sound",
"podcast",
"podcasting",
"region:us"
] | peter-h-o-r-v | null | null | 0 | 0 | 2023-10-09T17:45:03 | ---
license: artistic-2.0
pretty_name: The Autocast Initiative
tags:
- art
- sound
- podcast
- podcasting
---
# The Autocast Initiative
This dataset archives, in real time, podcasts that identify with the principle of autocasting as their method for sharing audio files with an audience of subscribers.
All contributors are volunteers.
## The Principles of Autocasting
* The content is primarily not created.
* Neither the files nor the RSS feed is manipulated after publishing, other than to correct mistakes.
  * The "episode description" is the exception to the above. Use this field however you please.
* No method is to be considered "too low-effort" when it comes to generating audio files.
* Creators of monetized content are encouraged to commit scrambled content and provide means for unscrambling as they see fit.
  * Further monetization is encouraged.
* Get paid if you can.
## How to contribute
Create a folder for your autocast as so:
```
/archive/[Name of your feed]/
```
Do not substitute special characters (if possible).
In this folder, include your episodes as well as snapshots of your RSS feed at the time of publishing (if possible):
```
/archive/[Name of your feed]/[001].mp3 // or whichever format you use
/archive/[Name of your feed]/[001].xml
/archive/[Name of your feed]/[002].mp3 // ...
/archive/[Name of your feed]/[002].xml
...
/archive/[Name of your feed]/[00n].mp3 // ...
/archive/[Name of your feed]/[00n].xml
...
```
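The zero-padded layout above can be generated programmatically; a minimal sketch, assuming the `/archive/[feed]/[NNN].mp3` convention (the `episode_paths` helper is illustrative, not part of any tooling):

```python
def episode_paths(feed, count, ext="mp3", width=3):
    """Build /archive/<feed>/<NNN>.<ext> paths for episodes 1..count."""
    return [f"/archive/{feed}/{i:0{width}d}.{ext}" for i in range(1, count + 1)]

print(episode_paths("My Feed", 2))
# ['/archive/My Feed/001.mp3', '/archive/My Feed/002.mp3']
```

Each `NNN.mp3` would be paired with a matching `NNN.xml` snapshot; for feeds beyond 999 episodes, a larger `width` keeps filenames sorting in publication order.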
If you intend to publish more than 1000 episodes in a single feed, figure it out (responsibly) | 1,569 | [
[
-0.034149169921875,
0.01488494873046875,
0.0120697021484375,
0.043975830078125,
-0.0179290771484375,
0.00809478759765625,
-0.00423431396484375,
-0.01433563232421875,
0.045562744140625,
0.028961181640625,
-0.0831298828125,
-0.0296630859375,
-0.060791015625,
0... |
hmao/rule_gen_splunk | 2023-10-09T18:43:48.000Z | [
"region:us"
] | hmao | null | null | 0 | 0 | 2023-10-09T18:42:26 | ---
dataset_info:
features:
- name: instruction
dtype: 'null'
- name: rule
dtype: 'null'
- name: software
dtype: 'null'
- name: configuration
dtype: 'null'
- name: description
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 1376
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rule_gen_splunk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 584 | [
[
-0.0272216796875,
-0.0189361572265625,
0.010711669921875,
0.036773681640625,
-0.0364990234375,
-0.017974853515625,
0.019775390625,
-0.002918243408203125,
0.07342529296875,
0.03265380859375,
-0.06597900390625,
-0.0599365234375,
-0.032806396484375,
-0.00803375... |
open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA | 2023-10-09T18:55:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T18:55:49 | ---
pretty_name: Evaluation run of v2ray/LLaMA-2-Jannie-70B-QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [v2ray/LLaMA-2-Jannie-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-09T18:55:45.725131](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA/blob/main/results_2023-10-09T18-55-45.725131.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5506501677852349,\n\
\ \"em_stderr\": 0.0050941277409732805,\n \"f1\": 0.5974674916107394,\n\
\ \"f1_stderr\": 0.004813528422862131,\n \"acc\": 0.5735917227001633,\n\
\ \"acc_stderr\": 0.011696543872157381\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5506501677852349,\n \"em_stderr\": 0.0050941277409732805,\n\
\ \"f1\": 0.5974674916107394,\n \"f1_stderr\": 0.004813528422862131\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.31766489764973466,\n \
\ \"acc_stderr\": 0.012824066621488854\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825909\n\
\ }\n}\n```"
repo_url: https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|drop|3_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-09T18-55-45.725131.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|gsm8k|5_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-09T18-55-45.725131.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|winogrande|5_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-09T18-55-45.725131.parquet'
- config_name: results
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- results_2023-10-09T18-55-45.725131.parquet
- split: latest
path:
- results_2023-10-09T18-55-45.725131.parquet
---
# Dataset Card for Evaluation run of v2ray/LLaMA-2-Jannie-70B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [v2ray/LLaMA-2-Jannie-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-09T18:55:45.725131](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA/blob/main/results_2023-10-09T18-55-45.725131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.5506501677852349,
"em_stderr": 0.0050941277409732805,
"f1": 0.5974674916107394,
"f1_stderr": 0.004813528422862131,
"acc": 0.5735917227001633,
"acc_stderr": 0.011696543872157381
},
"harness|drop|3": {
"em": 0.5506501677852349,
"em_stderr": 0.0050941277409732805,
"f1": 0.5974674916107394,
"f1_stderr": 0.004813528422862131
},
"harness|gsm8k|5": {
"acc": 0.31766489764973466,
"acc_stderr": 0.012824066621488854
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825909
}
}
```
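Each task reports its metric alongside a standard error; a small formatting sketch (the `fmt` helper is illustrative, with the Winogrande numbers copied from the results above) makes the pairs easier to scan:

```python
# Winogrande entry copied from the results above
winogrande = {"acc": 0.829518547750592, "acc_stderr": 0.010569021122825909}

def fmt(mean, stderr, digits=3):
    """Render a metric with its standard error, e.g. '0.830 +/- 0.011'."""
    return f"{mean:.{digits}f} +/- {stderr:.{digits}f}"

print(fmt(winogrande["acc"], winogrande["acc_stderr"]))  # 0.830 +/- 0.011
```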
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 7,257 | [
[
-0.02239990234375,
-0.04278564453125,
0.019073486328125,
0.01446533203125,
-0.0184173583984375,
0.00888824462890625,
-0.02020263671875,
-0.0183258056640625,
0.03277587890625,
0.041351318359375,
-0.0469970703125,
-0.0692138671875,
-0.051483154296875,
0.009269... |
jony4583/ddetr-cvpdl | 2023-10-09T19:16:31.000Z | [
"region:us"
] | jony4583 | null | null | 0 | 0 | 2023-10-09T19:16:31 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
Sharka/savve_dsjs | 2023-10-09T19:33:02.000Z | [
"region:us"
] | Sharka | null | null | 0 | 0 | 2023-10-09T19:20:03 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16 | 2023-10-29T09:15:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-09T19:22:37 | ---
pretty_name: Evaluation run of bhenrym14/mistral-7b-platypus-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bhenrym14/mistral-7b-platypus-fp16](https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T09:15:23.830857](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16/blob/main/results_2023-10-29T09-15-23.830857.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4168414429530201,\n\
\ \"em_stderr\": 0.005049151744527279,\n \"f1\": 0.4591768036912757,\n\
\ \"f1_stderr\": 0.0048851694906548275,\n \"acc\": 0.479468014382712,\n\
\ \"acc_stderr\": 0.010986687977801515\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4168414429530201,\n \"em_stderr\": 0.005049151744527279,\n\
\ \"f1\": 0.4591768036912757,\n \"f1_stderr\": 0.0048851694906548275\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17361637604245642,\n \
\ \"acc_stderr\": 0.010433463221257632\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345398\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|arc:challenge|25_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T09_15_23.830857
path:
- '**/details_harness|drop|3_2023-10-29T09-15-23.830857.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T09-15-23.830857.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T09_15_23.830857
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-15-23.830857.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T09-15-23.830857.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hellaswag|10_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T19-22-13.143311.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T19-22-13.143311.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T09_15_23.830857
path:
- '**/details_harness|winogrande|5_2023-10-29T09-15-23.830857.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T09-15-23.830857.parquet'
- config_name: results
data_files:
- split: 2023_10_09T19_22_13.143311
path:
- results_2023-10-09T19-22-13.143311.parquet
- split: 2023_10_29T09_15_23.830857
path:
- results_2023-10-29T09-15-23.830857.parquet
- split: latest
path:
- results_2023-10-29T09-15-23.830857.parquet
---
# Dataset Card for Evaluation run of bhenrym14/mistral-7b-platypus-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bhenrym14/mistral-7b-platypus-fp16](https://huggingface.co/bhenrym14/mistral-7b-platypus-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
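As the configuration listing above shows, each split name is simply the run timestamp with `-` and `:` replaced by `_` (e.g. run `2023-10-29T09:15:23.830857` becomes split `2023_10_29T09_15_23.830857`). A minimal helper to derive a split name from a run timestamp might look like this (illustrative only, not part of any official tooling):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert an ISO-style run timestamp to the split name used in this dataset.

    Split names keep the "T" separator and fractional seconds, but replace
    the "-" and ":" characters with "_".
    """
    return timestamp.replace("-", "_").replace(":", "_")


# Example: the winogrande run above
print(run_timestamp_to_split("2023-10-29T09:15:23.830857"))
# 2023_10_29T09_15_23.830857
```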
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T09:15:23.830857](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__mistral-7b-platypus-fp16/blob/main/results_2023-10-29T09-15-23.830857.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.4168414429530201,
"em_stderr": 0.005049151744527279,
"f1": 0.4591768036912757,
"f1_stderr": 0.0048851694906548275,
"acc": 0.479468014382712,
"acc_stderr": 0.010986687977801515
},
"harness|drop|3": {
"em": 0.4168414429530201,
"em_stderr": 0.005049151744527279,
"f1": 0.4591768036912757,
"f1_stderr": 0.0048851694906548275
},
"harness|gsm8k|5": {
"acc": 0.17361637604245642,
"acc_stderr": 0.010433463221257632
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345398
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,714 | [
[
-0.0302886962890625,
-0.045745849609375,
0.016876220703125,
0.020843505859375,
-0.013458251953125,
0.005214691162109375,
-0.032440185546875,
-0.01210784912109375,
0.023101806640625,
0.03778076171875,
-0.0511474609375,
-0.06610107421875,
-0.0521240234375,
0.0... |
sinagph/nuph-sft | 2023-10-14T16:07:28.000Z | [
"region:us"
] | sinagph | null | null | 0 | 0 | 2023-10-09T19:22:49 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
muryshev/saiga-chat | 2023-10-26T15:04:55.000Z | [
"region:us"
] | muryshev | null | null | 0 | 0 | 2023-10-09T19:27:18 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
ContextualAI/tiny-trivia_qa | 2023-10-09T19:42:17.000Z | [
"region:us"
] | ContextualAI | null | null | 0 | 0 | 2023-10-09T19:42:13 | ---
dataset_info:
features:
- name: target
dtype: string
- name: query
dtype: string
- name: gold_generation
sequence: string
splits:
- name: dev
num_bytes: 34332
num_examples: 100
download_size: 24000
dataset_size: 34332
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
---
# Dataset Card for "tiny-trivia_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 512 | [
[
-0.0399169921875,
-0.025421142578125,
0.036834716796875,
0.002704620361328125,
-0.022796630859375,
0.00722503662109375,
0.020050048828125,
-0.003086090087890625,
0.07080078125,
0.0169830322265625,
-0.0478515625,
-0.043609619140625,
-0.0142822265625,
-0.00976... |
Sharka/CIVQA_easyocr_encode_train | 2023-10-09T20:57:19.000Z | [
"region:us"
] | Sharka | null | null | 0 | 0 | 2023-10-09T20:29:57 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.016998291015625,
-0.05206298828125,
-0.01496124267578125,
-0.06036376953125,
0.0379... |
ricahrd/duduss | 2023-10-09T20:30:24.000Z | [
"region:us"
] | ricahrd | null | null | 0 | 0 | 2023-10-09T20:30:24 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01497650146484375,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.016998291015625,
-0.05206298828125,
-0.01496124267578125,
-0.06036376953125,
0.0379... |
fimu-docproc-research/CIVQA_easyocr_encode_train | 2023-10-09T20:54:46.000Z | [
"region:us"
] | fimu-docproc-research | null | null | 0 | 0 | 2023-10-09T20:44:46 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
ostapeno/cot | 2023-10-09T21:04:34.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-09T21:04:29 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 115613738
num_examples: 100000
download_size: 52113324
dataset_size: 115613738
---
# Dataset Card for "cot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 492 | [
[
-0.049407958984375,
-0.01372528076171875,
0.02264404296875,
0.0202178955078125,
-0.0222930908203125,
0.01349639892578125,
0.01543426513671875,
-0.0174560546875,
0.047027587890625,
0.041534423828125,
-0.0521240234375,
-0.0684814453125,
-0.0498046875,
-0.01864... |
ostapeno/oasst1 | 2023-10-09T21:04:39.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-09T21:04:37 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 51422776
num_examples: 33919
download_size: 20867411
dataset_size: 51422776
---
# Dataset Card for "oasst1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 492 | [
[
-0.0301666259765625,
-0.0189971923828125,
0.01824951171875,
-0.0003066062927246094,
-0.01528167724609375,
-0.00843048095703125,
0.043701171875,
-0.01016998291015625,
0.060028076171875,
0.0288848876953125,
-0.0645751953125,
-0.051513671875,
-0.043701171875,
-... |
ostapeno/stanford_alpaca | 2023-10-09T21:05:00.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-09T21:04:58 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 23769688
num_examples: 52002
download_size: 12254044
dataset_size: 23769688
---
# Dataset Card for "stanford_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 501 | [
[
-0.048004150390625,
-0.032012939453125,
0.01495361328125,
0.031524658203125,
-0.0232391357421875,
-0.0205841064453125,
0.0272064208984375,
-0.022796630859375,
0.06903076171875,
0.035980224609375,
-0.06256103515625,
-0.058929443359375,
-0.040130615234375,
-0.... |
ostapeno/self_instruct | 2023-10-09T21:05:03.000Z | [
"region:us"
] | ostapeno | null | null | 0 | 0 | 2023-10-09T21:05:00 | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 27516583
num_examples: 82439
download_size: 11204230
dataset_size: 27516583
---
# Dataset Card for "self_instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 499 | [
[
-0.03594970703125,
-0.01378631591796875,
0.0147552490234375,
0.022125244140625,
0.0017042160034179688,
-0.0059814453125,
0.02423095703125,
-0.004863739013671875,
0.05877685546875,
0.041351318359375,
-0.06292724609375,
-0.04449462890625,
-0.0290069580078125,
... |