---
pretty_name: Evaluation run of kashif/stack-llama-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kashif/stack-llama-2](https://huggingface.co/kashif/stack-llama-2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kashif__stack-llama-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T20:19:04.146812](https://huggingface.co/datasets/open-llm-leaderboard/details_kashif__stack-llama-2/blob/main/results_2023-09-22T20-19-04.146812.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931194434,\n \"f1\": 0.05443896812080537,\n\
\ \"f1_stderr\": 0.0012685965060744062,\n \"acc\": 0.4202036533620397,\n\
\ \"acc_stderr\": 0.010294487617119145\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931194434,\n\
\ \"f1\": 0.05443896812080537,\n \"f1_stderr\": 0.0012685965060744062\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10007581501137225,\n \
\ \"acc_stderr\": 0.008266274528685624\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/kashif/stack-llama-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|arc:challenge|25_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T20_19_04.146812
path:
- '**/details_harness|drop|3_2023-09-22T20-19-04.146812.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T20-19-04.146812.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T20_19_04.146812
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-19-04.146812.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-19-04.146812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hellaswag|10_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T07:07:44.494010.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T07:07:44.494010.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T20_19_04.146812
path:
- '**/details_harness|winogrande|5_2023-09-22T20-19-04.146812.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T20-19-04.146812.parquet'
- config_name: results
data_files:
- split: 2023_08_29T07_07_44.494010
path:
- results_2023-08-29T07:07:44.494010.parquet
- split: 2023_09_22T20_19_04.146812
path:
- results_2023-09-22T20-19-04.146812.parquet
- split: latest
path:
- results_2023-09-22T20-19-04.146812.parquet
---
# Dataset Card for Evaluation run of kashif/stack-llama-2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kashif/stack-llama-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kashif/stack-llama-2](https://huggingface.co/kashif/stack-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kashif__stack-llama-2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T20:19:04.146812](https://huggingface.co/datasets/open-llm-leaderboard/details_kashif__stack-llama-2/blob/main/results_2023-09-22T20-19-04.146812.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931194434,
"f1": 0.05443896812080537,
"f1_stderr": 0.0012685965060744062,
"acc": 0.4202036533620397,
"acc_stderr": 0.010294487617119145
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931194434,
"f1": 0.05443896812080537,
"f1_stderr": 0.0012685965060744062
},
"harness|gsm8k|5": {
"acc": 0.10007581501137225,
"acc_stderr": 0.008266274528685624
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
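As a quick sanity check on the figures above, the aggregate `acc` reported under `"all"` appears to be the unweighted mean of the per-task accuracies. A minimal sketch, using the task names and values copied verbatim from the results JSON:

```python
# Per-task accuracies copied from the results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.10007581501137225,
    "harness|winogrande|5": 0.7403314917127072,
}

# The "acc" reported under "all" matches the unweighted mean of these values.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)
```

The same relationship holds for `em` and `f1`, which are carried over directly from the single `harness|drop|3` task.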
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Bernardelcira/Descargar | 2023-08-29T08:24:02.000Z | [
"region:us"
] | Bernardelcira | null | null | null | 0 | 0 | Entry not found |
abdiharyadi/id_panl_bppt_with_amrbart_amr | 2023-08-29T11:15:01.000Z | [
"region:us"
] | abdiharyadi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- id
- name: topic
dtype:
class_label:
names:
'0': Economy
'1': International
'2': Science
'3': Sport
- name: amr
dtype: string
splits:
- name: train
num_bytes: 365469
num_examples: 1220
download_size: 170150
dataset_size: 365469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "id_panl_bppt_with_amrbart_amr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LinhDuong/fill50k | 2023-08-29T07:24:35.000Z | [
"license:apache-2.0",
"region:us"
] | LinhDuong | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
tyzhu/fwv2_random_num_train_10_eval_10 | 2023-08-29T07:26:45.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2677
num_examples: 30
- name: train_doc2id
num_bytes: 1651
num_examples: 20
- name: train_id2doc
num_bytes: 1711
num_examples: 20
- name: train_find_word
num_bytes: 966
num_examples: 10
- name: eval_find_word
num_bytes: 974
num_examples: 10
- name: id_context_mapping
num_bytes: 1071
num_examples: 20
download_size: 16570
dataset_size: 9050
---
# Dataset Card for "fwv2_random_num_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/gpt4math5k460token | 2023-08-29T07:28:40.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
Sampoerni/Xmode | 2023-08-29T08:22:39.000Z | [
"region:us"
] | Sampoerni | null | null | null | 0 | 0 | Entry not found |
qnquang/quangnguyen-test-llama2-1k | 2023-08-29T07:39:24.000Z | [
"region:us"
] | qnquang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1408214
num_examples: 1000
download_size: 819674
dataset_size: 1408214
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "quangnguyen-test-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
junlee666/test2 | 2023-08-29T07:53:02.000Z | [
"region:us"
] | junlee666 | null | null | null | 0 | 0 | Entry not found |
zxvix/pubmed_sonnet | 2023-08-29T13:17:20.000Z | [
"region:us"
] | zxvix | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: MedlineCitation
struct:
- name: PMID
dtype: int32
- name: DateCompleted
struct:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: NumberOfReferences
dtype: int32
- name: DateRevised
struct:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: Article
struct:
- name: Abstract
struct:
- name: AbstractText
dtype: string
- name: ArticleTitle
dtype: string
- name: AuthorList
struct:
- name: Author
sequence:
- name: LastName
dtype: string
- name: ForeName
dtype: string
- name: Initials
dtype: string
- name: CollectiveName
dtype: string
- name: Language
dtype: string
- name: GrantList
struct:
- name: Grant
sequence:
- name: GrantID
dtype: string
- name: Agency
dtype: string
- name: Country
dtype: string
- name: PublicationTypeList
struct:
- name: PublicationType
sequence: string
- name: MedlineJournalInfo
struct:
- name: Country
dtype: string
- name: ChemicalList
struct:
- name: Chemical
sequence:
- name: RegistryNumber
dtype: string
- name: NameOfSubstance
dtype: string
- name: CitationSubset
dtype: string
- name: MeshHeadingList
struct:
- name: MeshHeading
sequence:
- name: DescriptorName
dtype: string
- name: QualifierName
dtype: string
- name: PubmedData
struct:
- name: ArticleIdList
sequence:
- name: ArticleId
sequence: string
- name: PublicationStatus
dtype: string
- name: History
struct:
- name: PubMedPubDate
sequence:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: ReferenceList
sequence:
- name: Citation
dtype: string
- name: CitationId
dtype: int32
- name: text
dtype: string
- name: title
dtype: string
- name: original_text
dtype: string
splits:
- name: test
num_bytes: 3712700.992
num_examples: 974
download_size: 2134679
dataset_size: 3712700.992
---
# Dataset Card for "pubmed_sonnet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhenganlin/test-dataset | 2023-08-29T07:42:31.000Z | [
"license:openrail",
"region:us"
] | zhenganlin | null | null | null | 0 | 0 | ---
license: openrail
---
|
mickume/fandom_criticalrole | 2023-08-29T09:08:47.000Z | [
"region:us"
] | mickume | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 90421471
num_examples: 1047802
download_size: 55337912
dataset_size: 90421471
---
# Dataset Card for "fandom_criticalrole"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mickume/fandom_harrypotter | 2023-08-29T07:47:50.000Z | [
"region:us"
] | mickume | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 28616083
num_examples: 149535
download_size: 16456108
dataset_size: 28616083
---
# Dataset Card for "fandom_harrypotter"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/gpt4mathsub463 | 2023-08-29T07:47:56.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 1 | 0 | ---
license: mit
---
|
LahiruLowe/niv2_filtered_3pertask | 2023-08-29T07:50:28.000Z | [
"region:us"
] | LahiruLowe | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: original_index
dtype: int64
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 4509772
num_examples: 4668
download_size: 2486682
dataset_size: 4509772
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "niv2_filtered_3pertask"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LahiruLowe/t0_filtered_3pertask | 2023-08-29T07:55:54.000Z | [
"region:us"
] | LahiruLowe | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: original_index
dtype: int64
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 702847
num_examples: 579
download_size: 0
dataset_size: 702847
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "t0_filtered_3pertask"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/sycophancy | 2023-09-05T15:14:40.000Z | [
"region:us"
] | EleutherAI | This new dataset is designed to solve this great NLP task and is crafted with a lot of care. | @misc{perez2022discovering,
doi = {10.48550/ARXIV.2212.09251},
url = {https://arxiv.org/abs/2212.09251},
author = {Perez, Ethan and Ringer, Sam and Lukošiūtė, Kamilė and Nguyen, Karina and Chen, Edwin and Heiner, Scott and Pettit, Craig and Olsson, Catherine and Kundu, Sandipan and Kadavath, Saurav and Jones, Andy and Chen, Anna and Mann, Ben and Israel, Brian and Seethor, Bryan and McKinnon, Cameron and Olah, Christopher and Yan, Da and Amodei, Daniela and Amodei, Dario and Drain, Dawn and Li, Dustin and Tran-Johnson, Eli and Khundadze, Guro and Kernion, Jackson and Landis, James and Kerr, Jamie and Mueller, Jared and Hyun, Jeeyoon and Landau, Joshua and Ndousse, Kamal and Goldberg, Landon and Lovitt, Liane and Lucas, Martin and Sellitto, Michael and Zhang, Miranda and Kingsland, Neerav and Elhage, Nelson and Joseph, Nicholas and Mercado, Noemí and DasSarma, Nova and Rausch, Oliver and Larson, Robin and McCandlish, Sam and Johnston, Scott and Kravec, Shauna and {El Showk}, Sheer and Lanham, Tamera and Telleen-Lawton, Timothy and Brown, Tom and Henighan, Tom and Hume, Tristan and Bai, Yuntao and Hatfield-Dodds, Zac and Clark, Jack and Bowman, Samuel R. and Askell, Amanda and Grosse, Roger and Hernandez, Danny and Ganguli, Deep and Hubinger, Evan and Schiefer, Nicholas and Kaplan, Jared},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), Machine Learning (cs.LG), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Discovering Language Model Behaviors with Model-Written Evaluations},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
} | null | 0 | 0 | Entry not found |
EleutherAI/advanced_ai_risk | 2023-10-10T14:47:31.000Z | [
"region:us"
] | EleutherAI | This new dataset is designed to solve this great NLP task and is crafted with a lot of care. | @misc{perez2022discovering,
doi = {10.48550/ARXIV.2212.09251},
url = {https://arxiv.org/abs/2212.09251},
author = {Perez, Ethan and Ringer, Sam and Lukošiūtė, Kamilė and Nguyen, Karina and Chen, Edwin and Heiner, Scott and Pettit, Craig and Olsson, Catherine and Kundu, Sandipan and Kadavath, Saurav and Jones, Andy and Chen, Anna and Mann, Ben and Israel, Brian and Seethor, Bryan and McKinnon, Cameron and Olah, Christopher and Yan, Da and Amodei, Daniela and Amodei, Dario and Drain, Dawn and Li, Dustin and Tran-Johnson, Eli and Khundadze, Guro and Kernion, Jackson and Landis, James and Kerr, Jamie and Mueller, Jared and Hyun, Jeeyoon and Landau, Joshua and Ndousse, Kamal and Goldberg, Landon and Lovitt, Liane and Lucas, Martin and Sellitto, Michael and Zhang, Miranda and Kingsland, Neerav and Elhage, Nelson and Joseph, Nicholas and Mercado, Noemí and DasSarma, Nova and Rausch, Oliver and Larson, Robin and McCandlish, Sam and Johnston, Scott and Kravec, Shauna and {El Showk}, Sheer and Lanham, Tamera and Telleen-Lawton, Timothy and Brown, Tom and Henighan, Tom and Hume, Tristan and Bai, Yuntao and Hatfield-Dodds, Zac and Clark, Jack and Bowman, Samuel R. and Askell, Amanda and Grosse, Roger and Hernandez, Danny and Ganguli, Deep and Hubinger, Evan and Schiefer, Nicholas and Kaplan, Jared},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), Machine Learning (cs.LG), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Discovering Language Model Behaviors with Model-Written Evaluations},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
} | null | 0 | 0 | Entry not found |
slunayach/traindata | 2023-08-29T08:01:38.000Z | [
"region:us"
] | slunayach | null | null | null | 0 | 0 | Entry not found |
tyzhu/fwv2_squad_num_train_10_eval_10 | 2023-08-29T08:04:29.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4388
num_examples: 30
- name: train_doc2id
num_bytes: 3294
num_examples: 20
- name: train_id2doc
num_bytes: 3354
num_examples: 20
- name: train_find_word
num_bytes: 1034
num_examples: 10
- name: eval_find_word
num_bytes: 1009
num_examples: 10
- name: id_context_mapping
num_bytes: 2714
num_examples: 20
download_size: 23053
dataset_size: 15793
---
# Dataset Card for "fwv2_squad_num_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
usernamedesu/aichan-public-v3.1-jsonl | 2023-08-29T08:04:56.000Z | [
"region:us"
] | usernamedesu | null | null | null | 0 | 0 | Entry not found |
mickume/fandom_starwars | 2023-08-29T08:17:47.000Z | [
"region:us"
] | mickume | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 250280247
num_examples: 1135807
download_size: 139757383
dataset_size: 250280247
---
# Dataset Card for "fandom_starwars"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlok5923/gpt_dataset | 2023-08-29T09:19:14.000Z | [
"region:us"
] | nlok5923 | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_eye_movements_gosdt_l512_d3_sd2 | 2023-08-29T08:32:12.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2696882416
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_eye_movements_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eonuora/test | 2023-08-29T08:34:35.000Z | [
"license:apache-2.0",
"region:us"
] | eonuora | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
on123123/datasets | 2023-09-05T16:03:10.000Z | [
"region:us"
] | on123123 | null | null | null | 0 | 0 | |
qnquang/zien-llama2-test | 2023-08-29T08:36:47.000Z | [
"region:us"
] | qnquang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4319
num_examples: 13
download_size: 4354
dataset_size: 4319
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "zien-llama2-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeeKinXUn/nuo_and_hua | 2023-08-30T01:28:59.000Z | [
"region:us"
] | LeeKinXUn | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA | 2023-09-23T16:47:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of fangloveskari/ORCA_LLaMA_70B_QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fangloveskari/ORCA_LLaMA_70B_QLoRA](https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T16:47:31.229796](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA/blob/main/results_2023-09-23T16-47-31.229796.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the \"results\" configuration and the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3109270134228188,\n\
\ \"em_stderr\": 0.004740252668251192,\n \"f1\": 0.47044567953020594,\n\
\ \"f1_stderr\": 0.004325159736671571,\n \"acc\": 0.5600850420632693,\n\
\ \"acc_stderr\": 0.011402883443890944\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3109270134228188,\n \"em_stderr\": 0.004740252668251192,\n\
\ \"f1\": 0.47044567953020594,\n \"f1_stderr\": 0.004325159736671571\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2835481425322214,\n \
\ \"acc_stderr\": 0.012415070917508125\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n\
\ }\n}\n```"
repo_url: https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T16_47_31.229796
path:
- '**/details_harness|drop|3_2023-09-23T16-47-31.229796.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T16-47-31.229796.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T16_47_31.229796
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-47-31.229796.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-47-31.229796.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:51:06.198415.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:51:06.198415.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T16_47_31.229796
path:
- '**/details_harness|winogrande|5_2023-09-23T16-47-31.229796.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T16-47-31.229796.parquet'
- config_name: results
data_files:
- split: 2023_08_29T08_51_06.198415
path:
- results_2023-08-29T08:51:06.198415.parquet
- split: 2023_09_23T16_47_31.229796
path:
- results_2023-09-23T16-47-31.229796.parquet
- split: latest
path:
- results_2023-09-23T16-47-31.229796.parquet
---
# Dataset Card for Evaluation run of fangloveskari/ORCA_LLaMA_70B_QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fangloveskari/ORCA_LLaMA_70B_QLoRA](https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA",
"harness_winogrande_5",
split="train")
```
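As noted above, each run is stored as a split named after the run's timestamp. A small helper (an assumption inferred from the split names in this card, not an official leaderboard API) to derive a split name from a run timestamp:

```python
def timestamp_to_split(ts: str) -> str:
    # Split names replace '-' and ':' with '_' so the timestamp is a
    # valid split identifier, e.g.
    # '2023-09-23T16:47:31.229796' -> '2023_09_23T16_47_31.229796'
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-23T16:47:31.229796"))
```

The resulting string can then be passed as `split=` to `load_dataset` in place of `"train"` or `"latest"` to pin a specific run.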
## Latest results
These are the [latest results from run 2023-09-23T16:47:31.229796](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__ORCA_LLaMA_70B_QLoRA/blob/main/results_2023-09-23T16-47-31.229796.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3109270134228188,
"em_stderr": 0.004740252668251192,
"f1": 0.47044567953020594,
"f1_stderr": 0.004325159736671571,
"acc": 0.5600850420632693,
"acc_stderr": 0.011402883443890944
},
"harness|drop|3": {
"em": 0.3109270134228188,
"em_stderr": 0.004740252668251192,
"f1": 0.47044567953020594,
"f1_stderr": 0.004325159736671571
},
"harness|gsm8k|5": {
"acc": 0.2835481425322214,
"acc_stderr": 0.012415070917508125
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
}
}
```
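For this run, the top-level `"all"` accuracy is simply the unweighted mean of the per-task accuracies (a property of these numbers as printed; the leaderboard's exact aggregation rule is an assumption here). A quick check using the values above:

```python
# Per-task accuracies copied from the results JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.2835481425322214},
    "harness|winogrande|5": {"acc": 0.8366219415943172},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)

print(mean_acc)  # matches the "all" acc: 0.5600850420632693
```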
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_migtissera__Synthia-70B-v1.1 | 2023-09-23T19:08:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of migtissera/Synthia-70B-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Synthia-70B-v1.1](https://huggingface.co/migtissera/Synthia-70B-v1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-70B-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T19:08:11.059191](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.1/blob/main/results_2023-09-23T19-08-11.059191.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.33326342281879195,\n\
\ \"em_stderr\": 0.004827370333271099,\n \"f1\": 0.39018036912751786,\n\
\ \"f1_stderr\": 0.004711418943333287,\n \"acc\": 0.5775224946788872,\n\
\ \"acc_stderr\": 0.011611460846674582\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.33326342281879195,\n \"em_stderr\": 0.004827370333271099,\n\
\ \"f1\": 0.39018036912751786,\n \"f1_stderr\": 0.004711418943333287\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.31842304776345715,\n \
\ \"acc_stderr\": 0.012832225723075403\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273763\n\
\ }\n}\n```"
repo_url: https://huggingface.co/migtissera/Synthia-70B-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T19_08_11.059191
path:
- '**/details_harness|drop|3_2023-09-23T19-08-11.059191.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T19-08-11.059191.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T19_08_11.059191
path:
- '**/details_harness|gsm8k|5_2023-09-23T19-08-11.059191.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T19-08-11.059191.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:55:05.432450.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:55:05.432450.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T08:55:05.432450.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T19_08_11.059191
path:
- '**/details_harness|winogrande|5_2023-09-23T19-08-11.059191.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T19-08-11.059191.parquet'
- config_name: results
data_files:
- split: 2023_08_29T08_55_05.432450
path:
- results_2023-08-29T08:55:05.432450.parquet
- split: 2023_09_23T19_08_11.059191
path:
- results_2023-09-23T19-08-11.059191.parquet
- split: latest
path:
- results_2023-09-23T19-08-11.059191.parquet
---
# Dataset Card for Evaluation run of migtissera/Synthia-70B-v1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-70B-v1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-70B-v1.1](https://huggingface.co/migtissera/Synthia-70B-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-70B-v1.1",
"harness_winogrande_5",
	split="latest")
```
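The per-run split names in the configuration above are derived mechanically from the run timestamp: `-` and `:` are replaced with `_`, while the fractional-second `.` is kept. A small sketch of this assumed mapping, useful when you want to select a specific historical run rather than `latest`:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name, e.g.
    '2023-09-23T19:08:11.059191' -> '2023_09_23T19_08_11.059191'."""
    # Only '-' and ':' are rewritten; the '.' before microseconds survives.
    return timestamp.replace("-", "_").replace(":", "_")
```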
## Latest results
These are the [latest results from run 2023-09-23T19:08:11.059191](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.1/blob/main/results_2023-09-23T19-08-11.059191.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.33326342281879195,
"em_stderr": 0.004827370333271099,
"f1": 0.39018036912751786,
"f1_stderr": 0.004711418943333287,
"acc": 0.5775224946788872,
"acc_stderr": 0.011611460846674582
},
"harness|drop|3": {
"em": 0.33326342281879195,
"em_stderr": 0.004827370333271099,
"f1": 0.39018036912751786,
"f1_stderr": 0.004711418943333287
},
"harness|gsm8k|5": {
"acc": 0.31842304776345715,
"acc_stderr": 0.012832225723075403
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273763
}
}
```
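The `"all"` entry appears to be the unweighted mean of the per-task metrics: its `acc` matches the average of the GSM8K and Winogrande accuracies (DROP reports `em`/`f1` rather than `acc`, so it does not contribute to `acc`). A quick check of that assumption against the numbers above:

```python
gsm8k_acc = 0.31842304776345715
winogrande_acc = 0.8366219415943172

# Unweighted mean over the tasks that report an accuracy
mean_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(mean_acc - 0.5775224946788872) < 1e-12
```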
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tollefj/texts-en-no-comp_0.5_min_4 | 2023-08-29T08:57:11.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 56394201
num_examples: 643248
download_size: 41884314
dataset_size: 56394201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.5_min_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lion-schnei/lions-datasets | 2023-08-29T09:00:43.000Z | [
"region:us"
] | lion-schnei | null | null | null | 0 | 0 | Entry not found |
Zhutingho/ZH-HK | 2023-08-29T09:31:34.000Z | [
"region:us"
] | Zhutingho | null | null | null | 0 | 0 | Entry not found |
tollefj/texts-en-no-comp_0.5_min_10 | 2023-08-29T09:01:46.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 34359087
num_examples: 286356
download_size: 24958613
dataset_size: 34359087
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.5_min_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/texts-en-no-comp_0.5_min_20 | 2023-08-29T09:02:27.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 7343528
num_examples: 37730
download_size: 5131409
dataset_size: 7343528
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.5_min_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_eye_movements_gosdt_l512_d3_sd3 | 2023-08-29T09:09:35.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2678676167
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_eye_movements_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
on123123/Sakura | 2023-08-29T09:32:38.000Z | [
"region:us"
] | on123123 | null | null | null | 0 | 0 | Entry not found |
saroj502/emotion-custom | 2023-08-29T09:07:11.000Z | [
"size_categories:n<1K",
"rlfh",
"argilla",
"human-feedback",
"region:us"
] | saroj502 | null | null | null | 0 | 0 | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for emotion-custom
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla with `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("saroj502/emotion-custom")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("saroj502/emotion-custom")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | TextField | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| sentiment | Sentiment | LabelQuestion | True | N/A | ['positive', 'neutral', 'negative'] |
| mixed-emotion | Mixed-emotion | MultiLabelQuestion | True | N/A | ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'] |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names; they contain the value(s) of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"fields": {
"text": "i didnt feel humiliated"
},
"metadata": {},
"responses": [],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"mixed-emotion": [],
"mixed-emotion-suggestion": null,
"mixed-emotion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"sentiment": [],
"sentiment-suggestion": null,
"sentiment-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"text": "i didnt feel humiliated"
}
```
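Note in the `datasets` view above that `metadata` arrives as a JSON-encoded string rather than a dict; a minimal sketch of reading such a record (the record literal is copied from the example above):

```python
import json

# Sample record as returned by `datasets` (copied from the card's example above)
record = {
    "external_id": None,
    "metadata": "{}",
    "mixed-emotion": [],
    "mixed-emotion-suggestion": None,
    "sentiment": [],
    "sentiment-suggestion": None,
    "text": "i didnt feel humiliated",
}

metadata = json.loads(record["metadata"])  # deserialize the JSON string
print(record["text"], metadata)  # → i didnt feel humiliated {}
```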
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **sentiment** is of type `LabelQuestion` with the following allowed values ['positive', 'neutral', 'negative'].
* **mixed-emotion** is of type `MultiLabelQuestion` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **sentiment-suggestion** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* (optional) **mixed-emotion-suggestion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
Additionally, there is one more optional field:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_eye_movements_gosdt_l512_d3_sd1 | 2023-08-29T09:10:46.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2726682247
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_eye_movements_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5 | 2023-08-29T09:19:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/llama-2-13B-ensemble-v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-13B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T09:17:14.183323](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5/blob/main/results_2023-08-29T09%3A17%3A14.183323.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5953117661059801,\n \"\
acc_stderr\": 0.03391896483304526,\n \"acc_norm\": 0.5994365516843435,\n\
\ \"acc_norm_stderr\": 0.033896234769528244,\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5327328500103707,\n\
\ \"mc2_stderr\": 0.015551697577870274\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216388,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6290579565823541,\n\
\ \"acc_stderr\": 0.004820697457420419,\n \"acc_norm\": 0.8306114319856602,\n\
\ \"acc_norm_stderr\": 0.0037432817493736324\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6806451612903226,\n\
\ \"acc_stderr\": 0.026522709674667765,\n \"acc_norm\": 0.6806451612903226,\n\
\ \"acc_norm_stderr\": 0.026522709674667765\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154336,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080852,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080852\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7879948914431673,\n\
\ \"acc_stderr\": 0.014616099385833685,\n \"acc_norm\": 0.7879948914431673,\n\
\ \"acc_norm_stderr\": 0.014616099385833685\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584194,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584194\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4849162011173184,\n\
\ \"acc_stderr\": 0.01671489037999606,\n \"acc_norm\": 0.4849162011173184,\n\
\ \"acc_norm_stderr\": 0.01671489037999606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045708,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045708\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6029411764705882,\n \"acc_stderr\": 0.019794488900024117,\n \
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.019794488900024117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.02970528405677244,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.02970528405677244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.017028707301245203,\n \"mc2\": 0.5327328500103707,\n\
\ \"mc2_stderr\": 0.015551697577870274\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|arc:challenge|25_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hellaswag|10_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T09:17:14.183323.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T09:17:14.183323.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T09:17:14.183323.parquet'
- config_name: results
data_files:
- split: 2023_08_29T09_17_14.183323
path:
- results_2023-08-29T09:17:14.183323.parquet
- split: latest
path:
- results_2023-08-29T09:17:14.183323.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-13B-ensemble-v5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13B-ensemble-v5](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5",
	"harness_truthfulqa_mc_0",
	split="latest")
```
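As noted above, the run splits are named after the run timestamp. A minimal sketch of how the split names in this card's configuration are derived (the timestamp below is the one from this run; `-` and `:` are replaced with `_`):

```python
# Derive the split name used in the data_files configuration from the
# run timestamp (see the split names in the YAML metadata above).
timestamp = "2023-08-29T09:17:14.183323"
split_name = timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_08_29T09_17_14.183323
```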
## Latest results
These are the [latest results from run 2023-08-29T09:17:14.183323](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v5/blob/main/results_2023-08-29T09%3A17%3A14.183323.json):
```json
{
"all": {
"acc": 0.5953117661059801,
"acc_stderr": 0.03391896483304526,
"acc_norm": 0.5994365516843435,
"acc_norm_stderr": 0.033896234769528244,
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5327328500103707,
"mc2_stderr": 0.015551697577870274
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216388,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759084
},
"harness|hellaswag|10": {
"acc": 0.6290579565823541,
"acc_stderr": 0.004820697457420419,
"acc_norm": 0.8306114319856602,
"acc_norm_stderr": 0.0037432817493736324
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6806451612903226,
"acc_stderr": 0.026522709674667765,
"acc_norm": 0.6806451612903226,
"acc_norm_stderr": 0.026522709674667765
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154336,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080852,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080852
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7879948914431673,
"acc_stderr": 0.014616099385833685,
"acc_norm": 0.7879948914431673,
"acc_norm_stderr": 0.014616099385833685
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584194,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584194
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4849162011173184,
"acc_stderr": 0.01671489037999606,
"acc_norm": 0.4849162011173184,
"acc_norm_stderr": 0.01671489037999606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045708,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045708
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.019794488900024117,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.019794488900024117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.02970528405677244,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.02970528405677244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3843329253365973,
"mc1_stderr": 0.017028707301245203,
"mc2": 0.5327328500103707,
"mc2_stderr": 0.015551697577870274
}
}
```
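The per-subtask numbers above feed the leaderboard's aggregate metrics. As a rough, self-contained sketch (not the leaderboard's exact code), the MMLU-style figure is approximately an unweighted macro average of the `hendrycksTest` subtask accuracies; here using only four of the subtasks listed above:

```python
import statistics

# Four of the 5-shot MMLU subtask accuracies copied from the results block above.
# The aggregate is assumed to be an unweighted macro average over subtasks;
# the real leaderboard averages over all 57 hendrycksTest subtasks.
subtask_acc = {
    "management": 0.7378640776699029,
    "marketing": 0.8504273504273504,
    "medical_genetics": 0.6,
    "miscellaneous": 0.7879948914431673,
}

macro_avg = statistics.mean(subtask_acc.values())
print(round(macro_avg, 4))  # 0.7441
```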
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
namvandy/mood_encode | 2023-08-29T09:28:41.000Z | [
"region:us"
] | namvandy | null | null | null | 0 | 0 | Entry not found |
aimona/eng-conversations_no-tokenizer_no-time | 2023-08-29T09:22:21.000Z | [
"region:us"
] | aimona | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 384592697
num_examples: 30052
download_size: 177374362
dataset_size: 384592697
---
# Dataset Card for "eng-conversations_no-tokenizer_no-time"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/texts-en-no-comp_0.65_min_4 | 2023-08-29T09:25:17.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 137252308
num_examples: 1497706
download_size: 102243070
dataset_size: 137252308
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.65_min_4"
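Judging from the `cosine_sim` field and the `_0.65` suffix in the dataset name, the sentence pairs appear to have been filtered by embedding similarity. A minimal sketch of such a filter, using hypothetical rows rather than actual dataset contents:

```python
# Hypothetical parallel pairs following the en / no / cosine_sim schema above
# (not actual dataset contents).
pairs = [
    {"en": "Good morning.", "no": "God morgen.", "cosine_sim": 0.91},
    {"en": "The weather is nice.", "no": "Dette er noe helt annet.", "cosine_sim": 0.42},
]

# Assumption: the _0.65 suffix denotes a minimum cosine-similarity cutoff.
kept = [p for p in pairs if p["cosine_sim"] >= 0.65]
print(len(kept))  # 1
```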
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
junlee666/ddd | 2023-08-29T09:25:12.000Z | [
"region:us"
] | junlee666 | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_MagicTelescope_gosdt_l512_d3_sd1 | 2023-08-29T09:31:40.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2605525039
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_MagicTelescope_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/texts-en-no-comp_0.65_min_10 | 2023-08-29T09:34:58.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 79815538
num_examples: 624414
download_size: 58104604
dataset_size: 79815538
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.65_min_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/texts-en-no-comp_0.65_min_20 | 2023-08-29T09:36:25.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 17814642
num_examples: 86101
download_size: 12507510
dataset_size: 17814642
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.65_min_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cinnachroma/cinnachroma-blood-sugar-support | 2023-08-29T09:44:42.000Z | [
"region:us"
] | cinnachroma | null | null | null | 0 | 0 | **Product Name** - [CinnaChroma](https://cinnachroma-advanced-blood-sugar-support.jimdosite.com/)
**Category** - Blood Sugar Support Formula
**Result** - 2-3 Months
**Country** - USA
**Availability** - [Online](https://www.healthsupplement24x7.com/get-cinnachroma)
**Official Website** - [https://www.healthsupplement24x7.com/get-cinnachroma](https://www.healthsupplement24x7.com/get-cinnachroma)
[CinnaChroma](https://www.sympla.com.br/evento/cinnachroma-customer-reviews-does-it-really-work-read-full-article-to-know-more/2139700) - a brand well-known in the health market - is a complete nutritional supplement. The natural ingredients in this supplement work together in a synergistic way. Each ingredient was selected for its ability to promote healthy blood. This powerful combination is designed to help people maintain healthy blood sugar levels and control their blood pressure. Due to their effectiveness, [CinnaChroma](https://healthsupplements24x7.blogspot.com/2023/08/cinnachroma.html) tablets are increasingly popular among those who want natural solutions to their blood health problems.
[](https://www.healthsupplement24x7.com/get-cinnachroma)
### [**Click Here – OFFICIAL WEBSITE (CinnaChroma)**](https://www.healthsupplement24x7.com/get-cinnachroma)
**What Is CinnaChroma?**
------------------------
According to the creators of **[CinnaChroma](https://www.ivoox.com/cinnachroma-customer-reviews-does-it-really-work-read-audios-mp3_rf_115025778_1.html)**, this is a brand-new product that promises to control blood pressure and sugar levels. CinnaChroma's makers have made it a point to say that the product is superior to other products currently available. CinnaChroma is said to contain scientifically proven natural ingredients that regulate blood sugar and high blood pressure. [CinnaChroma](https://www.townscript.com/e/cinnachroma-042430) operates with a triple action that is said to produce immediate results. The makers believe that [CinnaChroma](https://pdfhost.io/v/r3dZWleKe_CinnaChroma_Customer_Reviews_Does_It_Really_Work_Read_Full_Article_To_Know_More) will aid in blood pressure and blood sugar management, in addition to supporting weight loss.
**How Does CinnaChroma Work?**
------------------------------
Human health is at risk from high blood sugar levels, and people with this condition need immediate treatment. Thanks to its effectiveness in maintaining and regulating blood sugar, the formula is positioned as a remedy for fluctuating blood sugar levels.
Significantly, it contains six potent ingredients that attack the source of diabetes and help to cure it naturally. Users of this formula are not required to give up their favourite foods or alter their lifestyles, according to the manufacturer. Users can lose weight naturally and safely with this blood sugar support formula.
**Benefits of CinnaChroma Supplement**
--------------------------------------
If you consume this supplement regularly for three to six months, you’re bound to experience life-changing prostate health benefits such as…
**Help Ease Worries of Diabetic Blindness**: The most common cause of blindness in the world is diabetic retinopathy. CinnaChroma, according to Barton Nutrition, can support recovery from severe retinal and optic nerve damage, easing concerns about diabetic blindness. Your optic nerves and retina are damaged by high blood sugar levels, but CinnaChroma is said to support these areas of your eyes to reduce concerns about blindness.
**Eliminate Blurry Vision and Floaters**: CinnaChroma allegedly treats eye nerve damage to eliminate floaters and blurry vision. Usually, diabetic retinopathy patients must endure a lifetime of retinal injections, vision loss, and eventual blindness. CinnaChroma shields damaged blood vessels from these issues.
Insulin resistance is the primary cause of Type 2 diabetes, and this supplement works to reverse it in order to keep you healthy.
**Support Healthy Blood Pressure**: It exclusively combines the most tried-and-true ingredients in order to support healthy blood pressure that is already within the normal range.
**Protect Retina Cells from Diabetes**: Retinal cells are impacted by diabetes. Your retina and optic nerve can become damaged by high blood sugar levels over time. CinnaChroma, however, is said to be able to shield your retinal cells from the effects of diabetes.
**Drop Blood Sugar Levels by 24%:** Diabetics typically use prescription drugs to bring their blood sugar levels down to normal levels. However, CinnaChroma can reduce blood sugar levels by 24%, according to the official Barton Nutrition website. According to the manufacturer, the formula has been "proven to reduce blood glucose levels dramatically fast" and can protect retina cells by "naturally dropping blood sugar levels by 24%."
**Promote a Healthy Insulin Response**: When blood sugar levels increase, your body responds by producing insulin. If you have diabetes, your body reacts to insulin differently from someone who does not. CinnaChroma, however, reportedly helps nitric oxide production, which in turn supports a healthy insulin response. Nitric oxide is essential for the delivery of insulin.
[.png)](https://www.healthsupplement24x7.com/get-cinnachroma)
### [**Enjoy The Benefits Of CinnaChroma – Order Now By Clicking Here!**](https://www.healthsupplement24x7.com/get-cinnachroma)
**Ingredients Used To Manufacture CinnaChroma**
-----------------------------------------------
The CinnaChroma formula is a potent mix of all-natural ingredients that help enhance retinal health. These super-ingredients and their properties are listed below:
**Chromium Picolinate :** Blood sugar control is another benefit of chromium picolinate. To reduce fasting blood glucose, chromium and picolinate both contribute. Together, they can reduce blood glucose levels by 300% and post-meal glucose levels by 200%. Chromium is guided into the cells of your digestive tract by the picolinate, preventing it from being lost along the way.
**Vitamin D3 :** Another advantageous component that functions as a regulatory hormone is vitamin D3. Vitamin D3 supports glucose uptake into cells and increases insulin release to ensure that your biological processes run as smoothly as possible. Additionally, vitamin D3 has been shown to reduce blood pressure and treat depression. Additionally, it strengthens the immune system. According to Dr. Saunders' advice, the CinnaChroma supplement contains 5000 IUs.
**Vitamin K2 :** Additionally crucial to lowering high blood sugar levels is vitamin K2. It effectively enables the body to use vitamin D3 where it is needed. Along with enhancing your general health, vitamin K2 also aids in reducing inflammation.
**Vanadium :** Vanadium works by transporting blood glucose into your cells where it can be converted to energy. Vanadium and chromium work together to produce amazing effects. It first aids in lowering blood sugar levels and aids in suppressing cravings, enabling weight loss.
**Selenium :** The final component, selenium, works similarly to vitamin K2 in reducing inflammation. Selenium has also been shown to reduce the risk of cancer by 37%. Additionally, it works wonders for enhancing general wellbeing.
**Benfotiamine :** This component aids in blood sugar regulation and has been used for many years to treat diabetes-related nerve damage. Additionally, it improves nerve conduction, nerve damage, blood vessel health, and other problems brought on by diabetes. Additionally, this component lessens retinal cell death and shields them from high blood glucose levels.
[](https://www.healthsupplement24x7.com/get-cinnachroma)
### [**(Promo Offer) Visit The Official CinnaChroma Website To Order**](https://www.healthsupplement24x7.com/get-cinnachroma)
**How To Use CinnaChroma For Best Results?**
--------------------------------------------
CinnaChroma comes in capsule form, and every capsule contains a potent blend of 12 natural ingredients rich in vital vitamins, minerals, and nutrients. For best results, three capsules should be taken daily, ideally one capsule after each meal. It is recommended to remain consistent to see CinnaChroma work in full force.
**Side Effects Of Using CinnaChroma**
-------------------------------------
[CinnaChroma](https://colab.research.google.com/drive/1pczQ59tbV3UqJnTiHHgo8_PCR_vq3JK3) is a pure solution made with the best ingredients. Side effects were rare for those who took the recommended dose. The product was made in an FDA-approved, GMP-certified manufacturing facility.
Each bottle of [CinnaChroma](https://devfolio.co/projects/cinnachroma-ae41) is free from any preservatives or herbicides, stimulants or other chemicals that could harm your health. It may take some time but the results you will get are real. [CinnaChroma](https://form.jotform.com/cinnachroma/cinnachroma-blood-sugar-support) reviews have shown that the ingredients are completely natural and safe. [CinnaChroma](https://events.humanitix.com/cinnachroma-customer-reviews-does-it-really-work-read-full-article-to-know-more) side effects are therefore minimal.
**CinnaChroma Pricing**
-----------------------
_**1 Bottle - $59 per bottle - Plus Shipping & Handling**_
_**3 Bottles - $49 per bottle + Free Shipping in USA + Bonus! Free Digital Book**_
_**6 Bottles - $39 per bottle + Free Shipping in USA + Bonus! Free Digital Book**_
[.png)](https://www.healthsupplement24x7.com/get-cinnachroma)
### [**(Free Bonus Ebook) Order CinnaChroma From Its Official Website Here!**](https://www.healthsupplement24x7.com/get-cinnachroma)
**PLUS+ Exclusive Online FREE Recommended Bonus The Blood Sugar Solution Kit**
------------------------------------------------------------------------------
* The Stable Blood Sugar Resource Guide
* Natural Remedies for Erratic Blood Sugar
* The Low Blood Sugar Cookbook
* Carb-Counting Cheat Sheet
* Personal Meal & Exercise Planner
* Blood Sugar Solution Grocery List
* Free (One on One) Program Customization Coaching Call
[.png)](https://www.healthsupplement24x7.com/get-cinnachroma)
**100% Money Back Guarantee**
-----------------------------
[CinnaChroma](https://www.fuzia.com/article_detail/801095/cinnachroma-customer-reviews-does-it-really-work) comes with a 365-Day, 100% Money-Back Guarantee. That means if you change your mind about this decision at any point in the next 365 days... all you need to do is contact our US-based Customer Service and we will refund your purchase. You can even keep the FREE Blood Sugar Solution Kit bonus as our way of saying "thanks" for giving [CinnaChroma](https://www.businesslistings.net.au/CinnaChroma_Blood_Sugar_Support/NSW/Phoenix_Park/CinnaChroma/887028.aspx) a try...
**Where To Buy CinnaChroma?**
-----------------------------
The [CinnaChroma](https://www.scoop.it/topic/cinnachroma-blood-sugar-support) supplement is available on the official website only. Nevertheless, you may find these supplements on third-party websites, but there is no guarantee of their quality or efficiency. Therefore, purchase [CinnaChroma](https://cinnachroma-usa.clubeo.com/page/cinnachroma-blood-sugar-support-does-it-really-work-read-full-article-to-know-more.html) from the official website.
**The Bottom Line**
-------------------
The natural diabetes plant extracts of Gymnema Sylvestre, berberine, black plum, shilajit, licorice root, guggul, and bitter melon have all received excellent research attention. Interestingly enough, [CinnaChroma](https://cinnachroma-usa.clubeo.com/calendar/2023/08/29/cinnachroma-customer-reviews-does-it-really-work-read-full-article-to-know-more) is the only available natural supplement with all of these ingredients in its formulation.
The unique formula of [CinnaChroma](https://cinnachroma-usa.clubeo.com/) is what makes it stand out among all other natural diabetes supplements, and this is also the reason behind the immediate results [CinnaChroma](https://cinnachroma-usa.clubeo.com/page/cinnachroma-customer-reviews-does-it-really-work-read-full-article-to-know-more.html) can produce.
[.png)](https://www.healthsupplement24x7.com/get-cinnachroma)
### **[Do Not Miss Out On Special Discount At The Official Website Of CinnaChroma](https://www.healthsupplement24x7.com/get-cinnachroma)**
|
jayant26/llama-train | 2023-08-29T09:45:25.000Z | [
"region:us"
] | jayant26 | null | null | null | 0 | 0 | Entry not found |
amasing7/sf-dataset | 2023-08-29T10:21:08.000Z | [
"region:us"
] | amasing7 | null | null | null | 0 | 0 | Entry not found |
HydraLM/GPT4-10k-standardized | 2023-08-30T20:29:36.000Z | [
"region:us"
] | HydraLM | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 4380667
num_examples: 4052
download_size: 2227926
dataset_size: 4380667
---
# Dataset Card for "GPT4-10k-standardized"
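The `message` / `message_type` / `message_id` / `conversation_id` schema above flattens conversations into one row per message. As a sketch (with hypothetical rows, not real dataset contents), turns can be regrouped like this:

```python
from collections import defaultdict

# Hypothetical rows following the schema above (message, message_type,
# message_id, conversation_id). Not actual dataset contents.
rows = [
    {"message": "Summarize X.", "message_type": "instruction", "message_id": 0, "conversation_id": 0},
    {"message": "X is ...", "message_type": "output", "message_id": 1, "conversation_id": 0},
    {"message": "Translate Y.", "message_type": "instruction", "message_id": 0, "conversation_id": 1},
]

# Group rows by conversation and restore turn order within each one.
conversations = defaultdict(list)
for row in rows:
    conversations[row["conversation_id"]].append(row)
for turns in conversations.values():
    turns.sort(key=lambda r: r["message_id"])

print(len(conversations))  # 2
```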
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Elifslh/testt | 2023-08-29T10:02:17.000Z | [
"region:us"
] | Elifslh | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_RobbeD__Orca-Platypus-3B | 2023-08-29T10:09:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RobbeD/Orca-Platypus-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RobbeD/Orca-Platypus-3B](https://huggingface.co/RobbeD/Orca-Platypus-3B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RobbeD__Orca-Platypus-3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T10:07:29.426848](https://huggingface.co/datasets/open-llm-leaderboard/details_RobbeD__Orca-Platypus-3B/blob/main/results_2023-08-29T10%3A07%3A29.426848.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.27366722319077513,\n \"\
acc_stderr\": 0.03210093803398038,\n \"acc_norm\": 0.2768555704328155,\n\
\ \"acc_norm_stderr\": 0.0320995646677269,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.41928517905056045,\n\
\ \"mc2_stderr\": 0.0152672030417133\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3993174061433447,\n \"acc_stderr\": 0.014312094557946707,\n\
\ \"acc_norm\": 0.4308873720136519,\n \"acc_norm_stderr\": 0.014471133392642476\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4967138020314678,\n\
\ \"acc_stderr\": 0.004989673640014264,\n \"acc_norm\": 0.6532563234415455,\n\
\ \"acc_norm_stderr\": 0.004749606196363324\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263714\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292323,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292323\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642535,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642535\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444437,\n\
\ \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444437\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.035886248000917075,\n\
\ \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.035886248000917075\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.031618779179354094,\n\
\ \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.031618779179354094\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882364,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24403669724770644,\n \"acc_stderr\": 0.01841528635141641,\n \"\
acc_norm\": 0.24403669724770644,\n \"acc_norm_stderr\": 0.01841528635141641\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n\
\ \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.2600896860986547,\n\
\ \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594626,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594626\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.044811377559424694,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.044811377559424694\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347018,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347018\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n\
\ \"acc_stderr\": 0.03023638994217309,\n \"acc_norm\": 0.3076923076923077,\n\
\ \"acc_norm_stderr\": 0.03023638994217309\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.34738186462324394,\n\
\ \"acc_stderr\": 0.01702667174865574,\n \"acc_norm\": 0.34738186462324394,\n\
\ \"acc_norm_stderr\": 0.01702667174865574\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098447,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.025738854797818723,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.025738854797818723\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3633440514469453,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.3633440514469453,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.024748624490537365,\n\
\ \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.024748624490537365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902006,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.02236867256288675,\n\
\ \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.02236867256288675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.29411764705882354,\n \"acc_stderr\": 0.018433427649401892,\n \
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.018433427649401892\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.02478907133200767,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.02478907133200767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31840796019900497,\n\
\ \"acc_stderr\": 0.032941184790540944,\n \"acc_norm\": 0.31840796019900497,\n\
\ \"acc_norm_stderr\": 0.032941184790540944\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.035716092300534796,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.035716092300534796\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.03631053496488905,\n\
\ \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.03631053496488905\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.41928517905056045,\n\
\ \"mc2_stderr\": 0.0152672030417133\n }\n}\n```"
repo_url: https://huggingface.co/kashif/stack-llama-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:07:29.426848.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:07:29.426848.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:07:29.426848.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:07:29.426848.parquet'
- config_name: results
data_files:
- split: 2023_08_29T10_07_29.426848
path:
- results_2023-08-29T10:07:29.426848.parquet
- split: latest
path:
- results_2023-08-29T10:07:29.426848.parquet
---
# Dataset Card for Evaluation run of RobbeD/Orca-Platypus-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RobbeD/Orca-Platypus-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RobbeD/Orca-Platypus-3B](https://huggingface.co/RobbeD/Orca-Platypus-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RobbeD__Orca-Platypus-3B",
"harness_truthfulqa_mc_0",
split="train")
```
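Each per-task config name in this repository is a flattened form of the harness task key, with `|`, `:`, and `-` replaced by `_` (compare `harness|truthfulqa:mc|0` with the `harness_truthfulqa_mc_0` config above). A minimal sketch of that mapping — the helper name is ours for illustration, not part of the `datasets` API:

```python
def task_key_to_config_name(task_key: str) -> str:
    """Flatten a harness task key (e.g. "harness|truthfulqa:mc|0")
    into the config name used by this dataset repository."""
    return task_key.replace("|", "_").replace(":", "_").replace("-", "_")

# The output matches the config names listed in the YAML header above.
print(task_key_to_config_name("harness|truthfulqa:mc|0"))
# -> harness_truthfulqa_mc_0
print(task_key_to_config_name("harness|hendrycksTest-world_religions|5"))
# -> harness_hendrycksTest_world_religions_5
```

The derived name can then be passed as the second argument to `load_dataset` as in the example above.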
## Latest results
These are the [latest results from run 2023-08-29T10:07:29.426848](https://huggingface.co/datasets/open-llm-leaderboard/details_RobbeD__Orca-Platypus-3B/blob/main/results_2023-08-29T10%3A07%3A29.426848.json):
```python
{
"all": {
"acc": 0.27366722319077513,
"acc_stderr": 0.03210093803398038,
"acc_norm": 0.2768555704328155,
"acc_norm_stderr": 0.0320995646677269,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.41928517905056045,
"mc2_stderr": 0.0152672030417133
},
"harness|arc:challenge|25": {
"acc": 0.3993174061433447,
"acc_stderr": 0.014312094557946707,
"acc_norm": 0.4308873720136519,
"acc_norm_stderr": 0.014471133392642476
},
"harness|hellaswag|10": {
"acc": 0.4967138020314678,
"acc_stderr": 0.004989673640014264,
"acc_norm": 0.6532563234415455,
"acc_norm_stderr": 0.004749606196363324
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292323,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292323
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642535,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642535
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444437,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444437
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.035886248000917075,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.035886248000917075
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.031618779179354094,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.031618779179354094
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882364,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24403669724770644,
"acc_stderr": 0.01841528635141641,
"acc_norm": 0.24403669724770644,
"acc_norm_stderr": 0.01841528635141641
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.044811377559424694,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.044811377559424694
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347018,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347018
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.03023638994217309,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.03023638994217309
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.34738186462324394,
"acc_stderr": 0.01702667174865574,
"acc_norm": 0.34738186462324394,
"acc_norm_stderr": 0.01702667174865574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098447,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.025738854797818723,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.025738854797818723
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3633440514469453,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.3633440514469453,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.024748624490537365,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.024748624490537365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902006,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.02236867256288675,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.02236867256288675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.018433427649401892,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.018433427649401892
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.02478907133200767,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.02478907133200767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31840796019900497,
"acc_stderr": 0.032941184790540944,
"acc_norm": 0.31840796019900497,
"acc_norm_stderr": 0.032941184790540944
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.035716092300534796,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.035716092300534796
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.41928517905056045,
"mc2_stderr": 0.0152672030417133
}
}
```
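The `"all"` block at the top of the JSON aggregates the per-task scores below it. As an illustration of the layout, here is a minimal sketch that averages the `acc` field over `hendrycksTest` (MMLU) entries; the `results` dict is a small excerpt of the scores shown above, and the leaderboard's own aggregation may differ from this plain mean:

```python
# Excerpt of the per-task scores shown above; keys follow the
# "harness|<task>|<n_shots>" convention.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.34074074074074073},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.28289473684210525},
}

# Select the MMLU subset scores by key prefix and take a plain mean.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"mean acc over {len(mmlu_accs)} MMLU subsets: {mean_acc:.4f}")
```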
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B | 2023-09-23T06:28:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of RobbeD/OpenLlama-Platypus-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RobbeD/OpenLlama-Platypus-3B](https://huggingface.co/RobbeD/OpenLlama-Platypus-3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T06:28:14.000432](https://huggingface.co/datasets/open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B/blob/main/results_2023-09-23T06-28-14.000432.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06145134228187919,\n\
\ \"em_stderr\": 0.002459425856611146,\n \"f1\": 0.11012269295302003,\n\
\ \"f1_stderr\": 0.002656818706713483,\n \"acc\": 0.3355993065948289,\n\
\ \"acc_stderr\": 0.008117942480603072\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.06145134228187919,\n \"em_stderr\": 0.002459425856611146,\n\
\ \"f1\": 0.11012269295302003,\n \"f1_stderr\": 0.002656818706713483\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \
\ \"acc_stderr\": 0.0029206661987887473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.65982636148382,\n \"acc_stderr\": 0.013315218762417397\n\
\ }\n}\n```"
repo_url: https://huggingface.co/RobbeD/OpenLlama-Platypus-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T06_28_14.000432
path:
- '**/details_harness|drop|3_2023-09-23T06-28-14.000432.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T06-28-14.000432.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T06_28_14.000432
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-28-14.000432.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-28-14.000432.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:12:53.419020.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:12:53.419020.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T06_28_14.000432
path:
- '**/details_harness|winogrande|5_2023-09-23T06-28-14.000432.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T06-28-14.000432.parquet'
- config_name: results
data_files:
- split: 2023_08_29T10_12_53.419020
path:
- results_2023-08-29T10:12:53.419020.parquet
- split: 2023_09_23T06_28_14.000432
path:
- results_2023-09-23T06-28-14.000432.parquet
- split: latest
path:
- results_2023-09-23T06-28-14.000432.parquet
---
# Dataset Card for Evaluation run of RobbeD/OpenLlama-Platypus-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RobbeD/OpenLlama-Platypus-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RobbeD/OpenLlama-Platypus-3B](https://huggingface.co/RobbeD/OpenLlama-Platypus-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B",
"harness_winogrande_5",
split="train")
```
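The timestamped split names listed in the config above appear to be the run timestamp with dashes and colons replaced by underscores; a minimal sketch of that mapping (inferred from the names in this card, not an official API):

```python
# Sketch: derive a split name from a run timestamp, assuming the naming
# convention visible in this card's config (dashes/colons -> underscores).
run_timestamp = "2023-09-23T06:28:14.000432"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # → 2023_09_23T06_28_14.000432
```

Passing such a timestamped name as `split=` (instead of `"train"` or `"latest"`) selects that specific run.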
## Latest results
These are the [latest results from run 2023-09-23T06:28:14.000432](https://huggingface.co/datasets/open-llm-leaderboard/details_RobbeD__OpenLlama-Platypus-3B/blob/main/results_2023-09-23T06-28-14.000432.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06145134228187919,
"em_stderr": 0.002459425856611146,
"f1": 0.11012269295302003,
"f1_stderr": 0.002656818706713483,
"acc": 0.3355993065948289,
"acc_stderr": 0.008117942480603072
},
"harness|drop|3": {
"em": 0.06145134228187919,
"em_stderr": 0.002459425856611146,
"f1": 0.11012269295302003,
"f1_stderr": 0.002656818706713483
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.0029206661987887473
},
"harness|winogrande|5": {
"acc": 0.65982636148382,
"acc_stderr": 0.013315218762417397
}
}
```
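The results payload above is plain JSON, so once downloaded the aggregate metrics can be pulled out with the standard library alone; a minimal sketch (the string below is a copied excerpt of the "all" section above):

```python
import json

# Sketch: parse the aggregate metrics from the latest results file.
# The JSON here is an excerpt of the "all" section shown above.
latest_results = json.loads("""
{
    "all": {
        "em": 0.06145134228187919,
        "f1": 0.11012269295302003,
        "acc": 0.3355993065948289
    }
}
""")

aggregate = latest_results["all"]
print(f"em={aggregate['em']:.4f}  f1={aggregate['f1']:.4f}  acc={aggregate['acc']:.4f}")
```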
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3 | 2023-08-29T18:55:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nathan0/mpt_delta_tuned_model_v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nathan0/mpt_delta_tuned_model_v3](https://huggingface.co/nathan0/mpt_delta_tuned_model_v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T18:53:57.396321](https://huggingface.co/datasets/open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3/blob/main/results_2023-08-29T18%3A53%3A57.396321.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28112521141201186,\n\
\ \"acc_stderr\": 0.032405505734312466,\n \"acc_norm\": 0.2851491508040904,\n\
\ \"acc_norm_stderr\": 0.03239478354615427,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.35460998683456907,\n\
\ \"mc2_stderr\": 0.013780749850644137\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.454778156996587,\n \"acc_stderr\": 0.014551507060836353,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5777733519219279,\n\
\ \"acc_stderr\": 0.004929048482760455,\n \"acc_norm\": 0.7639912368054173,\n\
\ \"acc_norm_stderr\": 0.004237598142007246\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768076,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768076\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.029379170464124825,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.029379170464124825\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904276,\n \"\
acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904276\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n\
\ \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.033248379397581594\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232287,\n\
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232287\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277733,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277733\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567978,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567978\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27889908256880735,\n \"acc_stderr\": 0.019227468876463514,\n \"\
acc_norm\": 0.27889908256880735,\n \"acc_norm_stderr\": 0.019227468876463514\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355147,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355147\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.2647058823529412,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \"\
acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.04524596007030049,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.04524596007030049\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.029745048572674033,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.029745048572674033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29757343550446996,\n\
\ \"acc_stderr\": 0.01634911191290943,\n \"acc_norm\": 0.29757343550446996,\n\
\ \"acc_norm_stderr\": 0.01634911191290943\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.025738854797818702,\n\
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.025738854797818702\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.02567025924218894,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.02567025924218894\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n\
\ \"acc_stderr\": 0.01128503316555127,\n \"acc_norm\": 0.26597131681877445,\n\
\ \"acc_norm_stderr\": 0.01128503316555127\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30409356725146197,\n \"acc_stderr\": 0.03528211258245232,\n\
\ \"acc_norm\": 0.30409356725146197,\n \"acc_norm_stderr\": 0.03528211258245232\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.35460998683456907,\n\
\ \"mc2_stderr\": 0.013780749850644137\n }\n}\n```"
repo_url: https://huggingface.co/kashif/stack-llama-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|arc:challenge|25_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hellaswag|10_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:14:18.725363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T18:53:57.396321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T10:14:18.725363.parquet'
- split: 2023_08_29T18_53_57.396321
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:53:57.396321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T18:53:57.396321.parquet'
- config_name: results
data_files:
- split: 2023_08_29T10_14_18.725363
path:
- results_2023-08-29T10:14:18.725363.parquet
- split: 2023_08_29T18_53_57.396321
path:
- results_2023-08-29T18:53:57.396321.parquet
- split: latest
path:
- results_2023-08-29T18:53:57.396321.parquet
---
# Dataset Card for Evaluation run of nathan0/mpt_delta_tuned_model_v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nathan0/mpt_delta_tuned_model_v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nathan0/mpt_delta_tuned_model_v3](https://huggingface.co/nathan0/mpt_delta_tuned_model_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3",
"harness_truthfulqa_mc_0",
split="train")
```
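The timestamped split names listed in the configs above are derived from the run timestamp by replacing the characters that are not allowed in split names. A minimal sketch of that mapping (the helper name `timestamp_to_split` is illustrative, inferred from the split names in this card's YAML, not part of any library):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2023-08-29T18:53:57.396321'
    into the split name used in this dataset's configs
    (dashes and colons become underscores)."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-29T18:53:57.396321"))
# 2023_08_29T18_53_57.396321
```

This lets you go from a results filename to the matching split name when loading a specific run instead of the "latest" split.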
## Latest results
These are the [latest results from run 2023-08-29T18:53:57.396321](https://huggingface.co/datasets/open-llm-leaderboard/details_nathan0__mpt_delta_tuned_model_v3/blob/main/results_2023-08-29T18%3A53%3A57.396321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28112521141201186,
"acc_stderr": 0.032405505734312466,
"acc_norm": 0.2851491508040904,
"acc_norm_stderr": 0.03239478354615427,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.35460998683456907,
"mc2_stderr": 0.013780749850644137
},
"harness|arc:challenge|25": {
"acc": 0.454778156996587,
"acc_stderr": 0.014551507060836353,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255795
},
"harness|hellaswag|10": {
"acc": 0.5777733519219279,
"acc_stderr": 0.004929048482760455,
"acc_norm": 0.7639912368054173,
"acc_norm_stderr": 0.004237598142007246
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768076,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768076
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.029379170464124825,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.029379170464124825
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904276,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904276
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.023119362758232287,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.023119362758232287
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277733,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277733
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567978,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567978
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27889908256880735,
"acc_stderr": 0.019227468876463514,
"acc_norm": 0.27889908256880735,
"acc_norm_stderr": 0.019227468876463514
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.026491914727355147,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.026491914727355147
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030049,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030049
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.029745048572674033,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.029745048572674033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29757343550446996,
"acc_stderr": 0.01634911191290943,
"acc_norm": 0.29757343550446996,
"acc_norm_stderr": 0.01634911191290943
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.025738854797818702,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.025738854797818702
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.02567025924218894,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.02567025924218894
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169927,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169927
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.01128503316555127,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.01128503316555127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30409356725146197,
"acc_stderr": 0.03528211258245232,
"acc_norm": 0.30409356725146197,
"acc_norm_stderr": 0.03528211258245232
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.35460998683456907,
"mc2_stderr": 0.013780749850644137
}
}
```
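The `"all"` entry above summarizes the per-task scores. A minimal sketch of averaging per-task `acc` values from a results dict shaped like the one above (only three of the tasks are copied in for brevity, so this illustrative mean will not match the `"all"` value, and the leaderboard's own aggregation may differ):

```python
# Illustrative only: a plain mean over a small subset of the per-task
# accuracies shown in the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.2565789473684211},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))
# 0.2313
```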
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KatMarie/eu_test3 | 2023-08-29T10:21:16.000Z | [
"region:us"
] | KatMarie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 302617
num_examples: 5172
download_size: 207896
dataset_size: 302617
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eu_test3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/texts-en-no-comp_0.8_min_4 | 2023-08-29T10:22:11.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 269477510
num_examples: 2865364
download_size: 201333011
dataset_size: 269477510
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.8_min_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KatMarie/eu_test4 | 2023-08-29T10:23:20.000Z | [
"region:us"
] | KatMarie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 302617
num_examples: 5172
download_size: 207896
dataset_size: 302617
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eu_test4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spsither/whisper_large_r1_ts1693219076.0_prepare_dataset | 2023-08-30T11:34:50.000Z | [
"region:us"
] | spsither | null | null | null | 0 | 0 | Entry not found |
tollefj/texts-en-no-comp_0.8_min_10 | 2023-08-29T10:40:13.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 145691139
num_examples: 1072945
download_size: 106042871
dataset_size: 145691139
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.8_min_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/texts-en-no-comp_0.8_min_20 | 2023-08-29T10:42:39.000Z | [
"region:us"
] | tollefj | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
- name: cosine_sim
dtype: float64
splits:
- name: train
num_bytes: 31550085
num_examples: 142806
download_size: 22271228
dataset_size: 31550085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texts-en-no-comp_0.8_min_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
blackbingris/3221321 | 2023-08-29T10:43:45.000Z | [
"region:us"
] | blackbingris | null | null | null | 0 | 0 | Entry not found |
chunpingvi/full_poem | 2023-09-26T15:02:59.000Z | [
"region:us"
] | chunpingvi | null | null | null | 0 | 0 | Entry not found |
CreatorPhan/Wiki_200 | 2023-08-30T08:08:21.000Z | [
"region:us"
] | CreatorPhan | null | null | null | 0 | 0 | Entry not found |
KatMarie/eu_test5 | 2023-08-29T11:19:32.000Z | [
"region:us"
] | KatMarie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 307789
num_examples: 5172
download_size: 208326
dataset_size: 307789
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eu_test5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yardeny/mlm_test_set_context_len_128 | 2023-08-29T10:58:51.000Z | [
"region:us"
] | yardeny | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 499200
num_examples: 640
download_size: 183124
dataset_size: 499200
---
# Dataset Card for "mlm_test_set_context_len_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xontoloyoo/mymodel | 2023-09-01T17:10:38.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | xontoloyoo | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
yzhuang/autotree_automl_MagicTelescope_gosdt_l512_d3_sd2 | 2023-08-29T11:04:23.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | Entry not found |
quocanh34/result_with_w2v2_baseline | 2023-08-29T11:08:59.000Z | [
"region:us"
] | quocanh34 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371487.625
num_examples: 1299
download_size: 164231228
dataset_size: 174371487.625
---
# Dataset Card for "result_with_w2v2_baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ai-business__Luban-13B | 2023-09-17T11:15:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ai-business/Luban-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ai-business/Luban-13B](https://huggingface.co/ai-business/Luban-13B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ai-business__Luban-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T11:15:33.793306](https://huggingface.co/datasets/open-llm-leaderboard/details_ai-business__Luban-13B/blob/main/results_2023-09-17T11-15-33.793306.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007340604026845637,\n\
\ \"em_stderr\": 0.0008741896875346207,\n \"f1\": 0.10464869966443034,\n\
\ \"f1_stderr\": 0.0019947106278579182,\n \"acc\": 0.431315608856773,\n\
\ \"acc_stderr\": 0.010029949190396351\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007340604026845637,\n \"em_stderr\": 0.0008741896875346207,\n\
\ \"f1\": 0.10464869966443034,\n \"f1_stderr\": 0.0019947106278579182\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09704321455648218,\n \
\ \"acc_stderr\": 0.008153768274554716\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237985\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ai-business/Luban-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T11_15_33.793306
path:
- '**/details_harness|drop|3_2023-09-17T11-15-33.793306.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T11-15-33.793306.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T11_15_33.793306
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-15-33.793306.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T11-15-33.793306.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T11_15_33.793306
path:
- '**/details_harness|winogrande|5_2023-09-17T11-15-33.793306.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T11-15-33.793306.parquet'
- config_name: results
data_files:
- split: 2023_08_29T11_08_27.769283
path:
- results_2023-08-29T11:08:27.769283.parquet
- split: 2023_09_17T11_15_33.793306
path:
- results_2023-09-17T11-15-33.793306.parquet
- split: latest
path:
- results_2023-09-17T11-15-33.793306.parquet
---
# Dataset Card for Evaluation run of ai-business/Luban-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ai-business/Luban-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ai-business/Luban-13B](https://huggingface.co/ai-business/Luban-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ai-business__Luban-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T11:15:33.793306](https://huggingface.co/datasets/open-llm-leaderboard/details_ai-business__Luban-13B/blob/main/results_2023-09-17T11-15-33.793306.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007340604026845637,
"em_stderr": 0.0008741896875346207,
"f1": 0.10464869966443034,
"f1_stderr": 0.0019947106278579182,
"acc": 0.431315608856773,
"acc_stderr": 0.010029949190396351
},
"harness|drop|3": {
"em": 0.007340604026845637,
"em_stderr": 0.0008741896875346207,
"f1": 0.10464869966443034,
"f1_stderr": 0.0019947106278579182
},
"harness|gsm8k|5": {
"acc": 0.09704321455648218,
"acc_stderr": 0.008153768274554716
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237985
}
}
```
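As a quick sanity check on the results above, the overall `acc` in the `"all"` block appears to be the unweighted mean of the per-task accuracies (this equal-weight averaging is an assumption inferred from the numbers shown; the snippet below only reproduces the values in this card):

```python
# Per-task accuracies copied verbatim from the results above (hypothetical
# aggregation check, not the leaderboard's actual code).
task_acc = {
    "harness|gsm8k|5": 0.09704321455648218,
    "harness|winogrande|5": 0.7655880031570639,
}

# The unweighted mean over tasks reproduces the "all" accuracy.
overall_acc = sum(task_acc.values()) / len(task_acc)
print(overall_acc)  # close to the reported 0.431315608856773
```

The `em`/`f1` entries in the `"all"` block likewise mirror the single `harness|drop|3` task, since DROP is the only task reporting those metrics in this run.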
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
temasarkisov/SolidLogosID_converted_processed_V2 | 2023-08-29T11:13:25.000Z | [
"region:us"
] | temasarkisov | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1031221.0
num_examples: 48
download_size: 1031152
dataset_size: 1031221.0
---
# Dataset Card for "SolidLogosID_converted_processed_V2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_eye_movements_gosdt_l512_d3 | 2023-08-29T11:17:14.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2708670915
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_eye_movements_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/mix-gpt4-6k-camel-rlhf-fixed-standardized | 2023-08-30T20:31:36.000Z | [
"region:us"
] | HydraLM | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 51028880
num_examples: 47010
- name: test
num_bytes: 3058844
num_examples: 2716
download_size: 25724863
dataset_size: 54087724
---
# Dataset Card for "mix-gpt4-6k-camel-rlhf-fixed-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Max5ive/tendergpt-training-dataset | 2023-08-29T11:28:06.000Z | [
"license:apache-2.0",
"region:us"
] | Max5ive | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
yardeny/mlm_test_set_context_len_256 | 2023-08-29T11:20:07.000Z | [
"region:us"
] | yardeny | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 495360
num_examples: 320
download_size: 183583
dataset_size: 495360
---
# Dataset Card for "mlm_test_set_context_len_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa-v2 | 2023-08-29T11:22:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/Platypus2-13B-LoRa-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2-13B-LoRa-v2](https://huggingface.co/yeontaek/Platypus2-13B-LoRa-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T11:20:59.240376](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa-v2/blob/main/results_2023-08-29T11%3A20%3A59.240376.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.571991245483798,\n \"\
acc_stderr\": 0.034294067141786025,\n \"acc_norm\": 0.5761375119651778,\n\
\ \"acc_norm_stderr\": 0.03427336583128381,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4191985438925104,\n\
\ \"mc2_stderr\": 0.014270484892545822\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.014518421825670444,\n\
\ \"acc_norm\": 0.5947098976109215,\n \"acc_norm_stderr\": 0.014346869060229328\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6179047998406691,\n\
\ \"acc_stderr\": 0.004849065962692132,\n \"acc_norm\": 0.8241386178052181,\n\
\ \"acc_norm_stderr\": 0.003799241408502969\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230172,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230172\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.01807575024163315,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.01807575024163315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291517,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291517\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652244,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652244\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398682,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398682\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n\
\ \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n\
\ \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363947,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363947\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.02685882587948854,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.02685882587948854\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001872,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001872\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.01272978538659857,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.01272978538659857\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5996732026143791,\n \"acc_stderr\": 0.01982184368827176,\n \
\ \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.01982184368827176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.031414708025865885,\n\
\ \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.031414708025865885\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4191985438925104,\n\
\ \"mc2_stderr\": 0.014270484892545822\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2-13B-LoRa-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|arc:challenge|25_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hellaswag|10_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:20:59.240376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:20:59.240376.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T11:20:59.240376.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T11:20:59.240376.parquet'
- config_name: results
data_files:
- split: 2023_08_29T11_20_59.240376
path:
- results_2023-08-29T11:20:59.240376.parquet
- split: latest
path:
- results_2023-08-29T11:20:59.240376.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-LoRa-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-LoRa-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-LoRa-v2](https://huggingface.co/yeontaek/Platypus2-13B-LoRa-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa-v2",
"harness_truthfulqa_mc_0",
split="train")
```
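As noted above, each run-specific split is named after the run's timestamp, with the characters that are invalid in split names (`-` in the date, `:` in the time) replaced by underscores, e.g. `2023_08_29T11_20_59.240376`. A minimal sketch of recovering the original timestamp from such a split name, assuming this naming scheme (the helper `split_name_to_timestamp` is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def split_name_to_timestamp(split_name: str) -> datetime:
    """Recover the run timestamp from a split name such as
    '2023_08_29T11_20_59.240376', where underscores stand in
    for '-' in the date part and ':' in the time part."""
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_timestamp("2023_08_29T11_20_59.240376"))
# → 2023-08-29 11:20:59.240376
```

This can be handy for sorting the available splits chronologically when a configuration contains more than one run.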
## Latest results
These are the [latest results from run 2023-08-29T11:20:59.240376](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-LoRa-v2/blob/main/results_2023-08-29T11%3A20%3A59.240376.json):
```python
{
"all": {
"acc": 0.571991245483798,
"acc_stderr": 0.034294067141786025,
"acc_norm": 0.5761375119651778,
"acc_norm_stderr": 0.03427336583128381,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4191985438925104,
"mc2_stderr": 0.014270484892545822
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.014518421825670444,
"acc_norm": 0.5947098976109215,
"acc_norm_stderr": 0.014346869060229328
},
"harness|hellaswag|10": {
"acc": 0.6179047998406691,
"acc_stderr": 0.004849065962692132,
"acc_norm": 0.8241386178052181,
"acc_norm_stderr": 0.003799241408502969
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230172,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230172
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291517,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291517
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652244,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652244
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398682,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398682
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363947,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363947
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948854,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948854
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001872,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001872
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.01272978538659857,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.01272978538659857
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.01982184368827176,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.01982184368827176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4191985438925104,
"mc2_stderr": 0.014270484892545822
}
}
```
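The per-task scores above can be aggregated directly from the JSON. A minimal sketch (using a two-task excerpt copied from the results above, not the full file) that averages `acc_norm` across `hendrycksTest` subtasks:

```python
import json

# Excerpt of the per-task results shown above (values copied from this card).
results_excerpt = """
{
  "harness|hendrycksTest-formal_logic|5": {"acc_norm": 0.36507936507936506},
  "harness|hendrycksTest-global_facts|5": {"acc_norm": 0.33}
}
"""

results = json.loads(results_excerpt)

# Average acc_norm over the hendrycksTest (MMLU) subtasks in the excerpt.
mmlu_scores = [v["acc_norm"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest")]
mean_acc_norm = sum(mmlu_scores) / len(mmlu_scores)
print(round(mean_acc_norm, 4))  # 0.3475
```

The full results file in this repo follows the same shape, so the same loop works over every `hendrycksTest` key it contains.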
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BadreddineHug/LayoutLM_data_test | 2023-08-29T11:33:56.000Z | [
"region:us"
] | BadreddineHug | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': Ref
'2': NumFa
'3': Fourniss
'4': DateFa
'5': DateLim
'6': TotalHT
'7': TVA
'8': TotalTTc
'9': unitP
'10': Qt
'11': TVAP
'12': descp
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 2642222.4
num_examples: 8
- name: test
num_bytes: 660555.6
num_examples: 2
download_size: 3029516
dataset_size: 3302778.0
---
# Dataset Card for "LayoutLM_data_test"
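A minimal sketch (plain Python, with the label names copied from the `ner_tags` class labels in the YAML above) of the id-to-label mapping this schema implies, as typically needed for a token-classification model's `id2label`/`label2id` config:

```python
# The ner_tags label set declared in the YAML metadata above.
ner_names = ["O", "Ref", "NumFa", "Fourniss", "DateFa", "DateLim",
             "TotalHT", "TVA", "TotalTTc", "unitP", "Qt", "TVAP", "descp"]

# Build both directions of the mapping.
id2label = dict(enumerate(ner_names))
label2id = {name: i for i, name in id2label.items()}

print(id2label[2], label2id["TVA"])  # NumFa 7
```

When the dataset itself is loaded with `datasets.load_dataset("BadreddineHug/LayoutLM_data_test")`, the same names are available as `ds["train"].features["ner_tags"].feature.names`.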
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Roboos/Harry-bot | 2023-08-29T11:36:34.000Z | [
"license:unknown",
"region:us"
] | Roboos | null | null | null | 0 | 0 | ---
license: unknown
---
|
cinnachromausa/cinnachroma | 2023-08-29T11:38:10.000Z | [
"region:us"
] | cinnachromausa | null | null | null | 0 | 0 | <h1><strong><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma">➤➤CinnaChroma – Official Website Link – Click Here</a></span></strong></h1>
<p><strong>➤➤ Product Name - <a href="https://cinnachromareviews.blogspot.com/2023/08/cinnachroma.html">CinnaChroma</a><br /></strong></p>
<p><strong>➤➤ Quantity Per Bottle - 30 Capsules/Jar<br /></strong></p>
<p><strong>➤➤ Category - Blood Sugar Support<br /></strong></p>
<p><strong>➤➤ Compostion - Natural Components Only</strong></p>
<p><strong>➤➤ Results - In Few Days</strong></p>
<p><strong>➤➤ Availability – Official Website <span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma">www.CinnaChroma.com</a></span></strong></p>
<p><strong>➤➤ Rating: - 4.8/5.0 ★★★★☆</strong></p>
<h3><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma">✅<strong>Click Here To Visit – OFFICIAL WEBSITE</strong>✅</a></span></h3>
<p><strong>Short review on <a href="https://cinnachroma-reviews.blogspot.com/2023/08/cinnachroma-blood-sugar-support.html">CinnaChroma</a>:</strong> The prevalence of prediabetes looms large, affecting approximately 84 million individuals, posing a potential risk of progressing into full-fledged diabetes within a mere five-year timeframe. This escalating concern is exacerbated by the alarming statistic that over 114 million Americans stand either at the brink of Type 2 diabetes or are already grappling with its challenges.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAc7OkzGYExa_6hwPzKr3iePyadlQ52jaoAaGRCsrm38czVRK5RchcslOWvZ_GjVuqxRPUdBB1cV807Wx3xMMNSsKGJ2jrN-orhJjAucBY5a_8WnG6L1vXY_0e4nAYJGzyOqU34xQscQTFXT0lwzdgSoaQDNn4SQQVoBCZvOxz9KNEAMCsOrk42n1Rwv_r/w640-h426/6-bottles.jpg" alt="" width="640" height="426" border="0" data-original-height="600" data-original-width="900" /></a></span></h2>
<p><a href="https://sites.google.com/view/cinnachroma-bloodsugar-formula/home">CinnaChroma</a> is a revolutionary dietary supplement formulated by Dr. Scott Saunders and Joe Barton and manufactured by Barton Nutrition. The formula in <a href="https://lookerstudio.google.com/reporting/b40e7013-7cea-4468-9b35-1de94d474e62">CinnaChroma</a> is designed to support healthy glucose metabolism, balance blood sugar levels, and significantly reduce the risk of type 2 diabetes. <a href="https://colab.research.google.com/drive/1UtY4GO9PpWQIwfyOUINonJQ947yEbQXz">CinnaChroma</a> is even formulated with ingredients that can help you manage your weight and shed fat safely and naturally.</p>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><strong>➦➦Visit The Official Website To Get CinnaChroma Now!</strong></a></span></h2>
<h2 style="text-align: left;"><strong>What is CinnaChroma?</strong></h2>
<p style="text-align: left;"><a href="https://cinnachroma-updates.clubeo.com/calendar/2023/08/28/cinnachroma-no-1-in-us-forces-to-quickly-decrease-blood-sugar-levels-without-causing-harmful-effects">CinnaChroma</a> emerges as a noteworthy nutritional supplement that holds the promise of facilitating glycemic regulation within the body. The manufacturer claims that this supplement integrates the well-established properties of cinnamon and chromium, both clinically recognized for their potential to reduce blood sugar levels, into a single, potent formulation. By combining the blood sugar-regulating prowess of these elements, <a href="https://cinnachroma-updates.clubeo.com">CinnaChroma</a> positions itself as a metabolic powerhouse drawn from nature's resources. Central to <a href="https://cinnachroma-updates.clubeo.com/page/cinnachroma-new-updates-powerfully-helps-reduce-a1c-blood-sugar-spikes-in-people-with-type-2.html">CinnaChroma</a>'s composition is its foundation in cinnamon, renowned for its traditional role as a sugar blocker. This formulation, coupled with five other nutrients scientifically shown to aid carbohydrate digestion and absorption, amplifies the supplement's potential benefits. This synergy between natural ingredients and contemporary medical insights creates a comprehensive approach to managing carbohydrate intake.</p>
<h2 style="text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgzsIFQUxm0hutZpaJZS_hxGYpf4wZuIRL3-PrAt2xDYzmatt2QqD_xygt7P9xco6Kt31aKAC_38PmG_xEL-yf7x3Fg0MUm671RrGUEt7z3k316ecMOFEHtDkdDYckgmx5zk5Mdiz9Hd4DZNfb6yhYgcFBDpKIouL-Y7clWSw6KdQ2DFKOe2iwQRbnRw42o/w640-h480/CinnaChroma%209.png" alt="" width="640" height="480" border="0" data-original-height="1050" data-original-width="1400" /></a></h2>
<p style="text-align: left;">When adhering to the recommended usage guidelines, <a href="https://huggingface.co/datasets/cinnachromacapsule/cinnachroma/blob/main/README.md">CinnaChroma</a> offers the enticing prospect of indulging in a broader array of foods without harboring concerns about triggering diabetes. Beyond merely facilitating carbohydrate digestion, the unique amalgamation of specific elements within <a href="https://huggingface.co/cinnachromacapsule/cinnachroma/blob/main/README.md">CinnaChroma</a> is designed to optimize sugar profile management. This multifaceted approach addresses not only the digestion of carbs but also the metabolism of sweets and junk foods.</p>
<h2 style="text-align: left;"><strong>Who Created CinnaChroma?</strong></h2>
<p style="text-align: left;">Joe Barton of Barton Nutrition created <a href="https://cinnachromareviews.hashnode.dev/cinnachroma">CinnaChroma</a> together with Dr. Scott Saunders. Joe and Dr. Saunders say they think alike: they want to help people find alternative medicinal support for their diseases, or reduce the risk of developing them in the first place. They add that they want to help as many people as possible by putting their knowledge and studies to work, which is why they spend many nights reviewing research trials and developing natural formulas that use ingredients from all corners of the world.</p>
<h2 style="text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjtm9ren99MNJag8dxsWebZJcttmmu6KKzDOhIA_k2DwxV5-lhtqzFyq1z2AxL4DKTu4qxcQUh0cfKiXWiajZJ6ZOm914ovfTaCuyrM49eJmohQ2M0a2ln2Bel6griqNv-ELsNuJ3oIpjcKLph09nu8byVB5B5205J6n2BPJ3GUdF4RdMXVMAquKSlQR9Aw/w640-h316/CinnaChroma%206.jpg" alt="" width="640" height="316" border="0" data-original-height="733" data-original-width="1487" /></a></h2>
<h2 style="text-align: left;"><strong>How does CinnaChroma work?</strong></h2>
<p style="text-align: left;"><a href="https://devfolio.co/@cinnachromaus">CinnaChroma</a> operates through a multi-faceted mechanism that leverages the individual and combined properties of its key ingredients, cinnamon and chromium, alongside other supporting nutrients. The supplement's working principle revolves around facilitating glycemic regulation, optimizing sugar profiles, and aiding in the management of carbohydrate intake. Here's a breakdown of how <a href="https://pdfhost.io/v/esXX8lq.P_CinnaChroma_New_Updates_Powerfully_Helps_Reduce_A1C_and_Blood_Sugar_Spikes_in_People_With_Type_2">CinnaChroma</a> works:</p>
<ol style="text-align: left;">
<li><strong>Blood Sugar Regulation with Cinnamon: </strong>Cinnamon, a well-known traditional spice, has been linked to blood sugar regulation. It contains compounds that enhance insulin sensitivity, potentially leading to improved glucose utilization by cells. This can result in more stable blood sugar levels after meals.</li>
<li><strong>Chromium's Role in Insulin Function:</strong> Chromium, an essential trace mineral, plays a role in enhancing insulin's effectiveness in the body. Insulin is a hormone that helps regulate blood sugar levels by facilitating the uptake of glucose into cells for energy.</li>
<li><strong>Carbohydrate Digestion and Absorption: </strong><a href="https://cinnachroma-reviews.company.site/">CinnaChroma</a>'s formulation includes five other nutrients that support the digestion and absorption of carbohydrates. These nutrients can assist the body in efficiently breaking down complex carbohydrates into simpler sugars, allowing for smoother absorption and preventing rapid spikes in blood sugar levels.</li>
<li><strong>Sugar Profile Management: </strong>The supplement's unique blend of ingredients contributes to optimizing sugar profiles. By combining cinnamon's potential to modulate post-meal blood sugar spikes with chromium's role in insulin enhancement, <a href="https://cinnachroma-official.jimdosite.com/">CinnaChroma</a> aims to create a balanced environment for blood sugar management.</li>
<li><strong>Indulgence without Concern: </strong><a href="https://soundcloud.com/cinnachroma-550145605/advance-blood-sugar-supporter-supports-healthy-blood-glicose-metabolism-in-body">CinnaChroma</a>'s comprehensive approach offers users the possibility to consume a wider range of foods, including carbohydrates, sweets, and junk foods, with reduced apprehension about their impact on blood sugar levels.</li>
<li><strong>Quality Manufacturing: </strong>The supplement is manufactured in an FDA-regulated facility in the United States, ensuring that the highest quality and safety standards are maintained during production.</li>
</ol>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><strong>➦➦Get your CinnaChroma Here & Get Great Discount!</strong></a></span></h2>
<h2 style="text-align: left;"><strong>Ingredients in CinnaChroma.</strong></h2>
<p style="text-align: left;"><a href="https://www.ivoox.com/cinnachroma-new-updates-is-it-really-works-audios-mp3_rf_115028023_1.html">CinnaChroma</a> is meticulously crafted with a selection of natural vitamins and minerals, designed to seamlessly integrate into diverse lifestyles and wellness routines. Developed in collaboration with Barton Nutrition's esteemed medical and nutrition advisor, Dr. Scott Saunders, <a href="https://cinnachroma-official.bandcamp.com/track/cinnachroma-no-1-in-us-forces-to-quickly-decrease-blood-sugar-levels-without-causing-harmful-effects">CinnaChroma</a> harmonizes the potent properties of cinnamon bark extract with other health-enhancing components to effectively manage glucose levels. Several key ingredients within <a href="https://www.podcasts.com/cinnachroma-official">CinnaChroma</a> contribute to its health-promoting attributes, each offering specific benefits:</p>
<p style="text-align: left;"><strong>Cinnamon Bark : </strong>Cinnamon Bark stands out for its effectiveness in reducing the risk factors associated with diabetes and cardiovascular diseases. A clinical trial published in Diabetes Care in 2003 underscored that cassia cinnamon yields favorable outcomes by decreasing blood glucose and cholesterol levels in individuals with type 2 diabetes. Consumed in moderation, cinnamon bark proves advantageous for overall health.</p>
<p style="text-align: left;"><strong>Chromium : </strong>Chromium supplementation potentially assists diabetics in their quest to lower blood sugar levels. Among various forms of chromium supplements, chromium picolinate emerges as the most effective. While research indicates that chromium can indeed lower glucose levels and enhance insulin sensitivity, it's noteworthy that not all studies have consistently demonstrated this benefit.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiEzvUY0YuBwg8fphsSkNIimG9IlSe8YqqlOHkAsij2KUykYTnsHOB_GPv64XNwC5AMHuDUanpfiUlFkWBRLTNfLKgMZxYM03taoyip22W0TJhj5HbGNjTxYpqV5iraWmV16KVhMX2QEQPR8hwD1KIRopD8JyI1Jwuc2wtHF0wF9pRGRUZm3oS4Njk6c4mj/w640-h302/CinnaChroma%204.png" alt="" width="640" height="302" border="0" data-original-height="661" data-original-width="1400" /></a></span></h2>
<p style="text-align: left;"><strong>Vanadium : </strong>Vanadium offers potential relief for diabetic neuropathy and mitigating pain arising from free radical damage. Supported by animal studies and limited human trials, vanadium exhibits the capacity to reduce blood sugar levels and bolster insulin sensitivity in type 2 diabetics. In a specific study involving individuals with type 2 diabetes, vanadium showcased its ability to reduce both total and LDL cholesterol levels.</p>
<p style="text-align: left;"><strong>Selenium : </strong>Selenium, an indispensable trace element, plays a pivotal role in the intricate defense mechanism against oxidative stress. The antioxidant attributes of selenium have the potential to inhibit the progression of diabetes. Existing evidence suggests that maintaining appropriate selenium levels is essential for facilitating insulin secretion.</p>
<p style="text-align: left;"><strong>Vitamin-K2 : </strong>Clinically recognized for its role in blood clotting, Vitamin K also offers intriguing insights into diabetes management. A series of human studies have spotlighted the capacity of vitamin K2 supplementation to improve insulin sensitivity. Furthermore, vitamin K2 supplementation has demonstrated the capability to lower the risk of developing diabetes.</p>
<h2 style="text-align: left;"><strong>Benefits of CinnaChroma .</strong></h2>
<p style="text-align: left;"><a href="https://www.deviantart.com/cinnachromacapsule/art/CinnaChroma-New-Updates-2023-Legit-or-hoax-979599933">CinnaChroma</a> offers a range of potential benefits attributed to its carefully curated blend of natural ingredients. These benefits align with the supplement's goal of promoting glycemic regulation and supporting overall well-being. Here are some key advantages associated with using CinnaChroma:</p>
<ul style="text-align: left;">
<li style="text-align: left;"><strong>Glycemic Regulation:</strong> <a href="https://haitiliberte.com/advert/cinnachroma-new-updates-powerfully-helps-reduce-a1c-blood-sugar-spikes-in-people-with-type-2/">CinnaChroma</a>'s core objective is to facilitate glycemic regulation, helping to maintain stable blood sugar levels. The inclusion of cinnamon and chromium, along with other nutrients, is designed to support this crucial aspect of metabolic health.</li>
<li style="text-align: left;"><strong>Blood Sugar Management: </strong>The supplement's formulation, which includes cinnamon bark extract, chromium, and other nutrients, aims to assist in managing blood sugar levels. This could potentially be valuable for individuals with prediabetes, Type 2 diabetes, or those aiming to prevent such conditions.</li>
<li style="text-align: left;"><strong>Carbohydrate Digestion:</strong> <a href="https://community.weddingwire.in/forum/cinnachroma-price-updates-2023-provides-you-positive-body-mechanism-to-supports-best-blood-sugar-levels--t143426">CinnaChroma</a>'s unique blend of ingredients supports efficient carbohydrate digestion. This can aid in breaking down complex carbohydrates into simpler sugars, potentially preventing rapid spikes in blood sugar levels.</li>
<li style="text-align: left;"><strong>Insulin Sensitivity: </strong>The inclusion of chromium and other components that enhance insulin sensitivity can contribute to more effective utilization of glucose by cells. This, in turn, may help in managing blood sugar levels and reducing the risk of insulin resistance.</li>
</ul>
<p style="text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuT4uEbltx4lzh1abpuU2FlKnBU6IRjk4qo5-ODHbK9aQvuyFPf4gzpKIvsI8THGhYpWUPxkNV8-IIR6TjqWYUxuWZ8aSrM6uMsCQ1igUXtH1Bz2JyUP0IhChUxzlEhWtz6-1IRapFagQxjP_3ciJ3lDsaWQuaqnnd-uTKFL3d3zhti8VTp7AdDP52EN4l/w640-h640/CinnaChroma%208.png" alt="" width="640" height="640" border="0" data-original-height="1400" data-original-width="1400" /></a></p>
<ul style="text-align: left;">
<li style="text-align: left;"><strong>Metabolic Support: </strong>The synergistic action of the various ingredients in <a href="https://www.weddingwire.com/wedding-forums/cinnachroma-advance-blood-sugar-supporter-supports-healthy-blood-glicose-metabolism-in-body/caeb5a61144907aa.html">CinnaChroma</a> may provide comprehensive metabolic support, contributing to the efficient utilization of nutrients and energy.</li>
<li style="text-align: left;"><strong>Antioxidant Protection: </strong>Certain ingredients, such as selenium, offer antioxidant qualities that can help protect cells from oxidative stress. This protection is particularly valuable in mitigating the risk factors associated with diabetes and related complications.</li>
<li style="text-align: left;"><strong>Potential Cardiovascular Benefits: </strong>The reduction in cholesterol levels associated with certain components, like cinnamon bark extract and vanadium, could contribute to improved cardiovascular health, which is often linked to diabetes management.</li>
<li style="text-align: left;"><strong>Holistic Approach to Diabetes Management: </strong><a href="https://forums.hitched.co.uk/chat/forums/thread/cinnachromaprice-updates-2023-provides-you-positive-body-mechanism-to-supports-best-blood-sugar-levels-1115611/">CinnaChroma</a>'s composition is aimed at tackling various facets of diabetes management, including blood sugar regulation, insulin sensitivity, and carbohydrate metabolism. This holistic approach aligns with the multifaceted nature of diabetes care.</li>
<li style="text-align: left;"><strong>Natural Ingredients: </strong>The reliance on natural ingredients, coupled with the absence of unwanted fillers or byproducts, underlines the supplement's commitment to offering benefits without undesirable side effects.</li>
<li style="text-align: left;"><strong>Expert Collaboration:</strong> The involvement of medical and nutrition experts in the development of <a href="https://medium.com/@cinnachromacapsule/cinnachroma-advance-blood-sugar-supporter-supports-healthy-blood-glicose-metabolism-in-body-4e5583d18a1f">CinnaChroma</a>, including Dr. Scott Saunders from Barton Nutrition, lends credibility to its potential benefits and safety.</li>
</ul>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><strong>➦➦Order Here Your CinnaChroma & Grab The Big Discount Right Now!</strong></a></span></h2>
<h2><strong>Pros of CinnaChroma:</strong></h2>
<ul>
<li data-aria-level="1" data-aria-posinset="1" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4"><strong>The groundbreaking supplement can significantly reduce the risk of obesity, type 2 diabetes, and other heart diseases.</strong></li>
<li data-aria-level="1" data-aria-posinset="2" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4"><a href="https://sketchfab.com/3d-models/cinnachroma-official-reviews-usa-legit-or-hoax-b30685690aae4f169d0745d9b5ff7646">CinnaChroma</a> can fully regulate your blood sugar levels and also maintain healthy levels of blood pressure and cholesterol.</li>
<li data-aria-level="1" data-aria-posinset="3" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4"><strong>It can increase your insulin production and sensitivity while decreasing your insulin resistance.</strong></li>
<li data-aria-level="1" data-aria-posinset="4" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4">The <a href="https://www.remotehub.com/cinnachroma.capsule">CinnaChroma</a> can improve blood circulation throughout the body.</li>
<li data-aria-level="1" data-aria-posinset="5" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4"><strong>It contains rich amounts of antioxidants that eliminate free radicals, oxidative stress, and other toxic pollutants.</strong></li>
<li data-aria-level="1" data-aria-posinset="5" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4">CinnaChroma can speed up the anti-inflammatory response of the body.</li>
<li data-aria-level="1" data-aria-posinset="5" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4"><strong>It can help you lose weight by getting rid of the fat buildup, especially in the stubborn places of your body.</strong></li>
<li data-aria-level="1" data-aria-posinset="5" data-font="Calibri" data-leveltext="●" data-list-defn-props="{"335552541":1,"335559684":-2,"335559685":720,"335559991":360,"469769242":[8226],"469777803":"left","469777804":"●","469777815":"multilevel"}" data-listid="4">CinnaChroma increases and fully supports glucose metabolism.</li>
</ul>
<h2><strong>Recommended dosage of <a href="https://rentry.co/cinnachroma-official">CinnaChroma</a>.<br /></strong></h2>
<p><a href="https://hackmd.io/@cinnachromacapsule/CinnaChroma">CinnaChroma</a> comes in convenient 30-day supply bottles, simplifying adherence to a user's health routine. To enjoy the benefits, take one capsule daily. It's crucial not to exceed the recommended dosage, as overconsumption could have detrimental health effects. Individuals who have experienced negative reactions to herbal supplements in the past should refrain from taking this supplement. CinnaChroma is not suitable for individuals under the age of 18, and pregnant or nursing women should avoid it, as it may pose risks to their health.</p>
<h2 style="text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9PKW0WDvK9rIuIq2UyWyDQL2Jw4P5_xaidn5dmXEVGCM8O3f6ENMYPDzOvv9TWCQuWWxaWWi9qvfuJTzgPv3bNn4K2piqghx4_IoLv0xikbahIc-Dl1fAjxnlTx4SmSOuRrSdNWkVMX_Ul4nXIM19VXRP8ODlcuSVstYsUNKhQa-1brpH2oQEjDQlDINT/w640-h250/CinnaChroma%201.jpg" alt="" width="640" height="250" border="0" data-original-height="1062" data-original-width="2727" /></a></h2>
<p>If users are currently taking any over-the-counter medications or have underlying medical conditions, it is advised to abstain from using this supplement. Prior to incorporating this dietary supplement into one's routine, seeking consultation with a healthcare professional is recommended. This precaution ensures that potential adverse reactions are minimized and one's health remains a priority.</p>
<h2><strong>Detailed Pricing of <a href="https://sites.google.com/view/cinnachroma-capsules/home">CinnaChroma</a>.</strong></h2>
<h2 style="text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzqZuIx6TPW-zBi9OO5IgBybBByJ_Ew3uE3e3dh7l6o5eZjbD4KkgAOwW2yHbtWz3O3cR6YUDt5pHzI3lAHCP2ykx0iy9Vfddc1TgP796mqHk-Uv_-H0bbyDW9Vak28RT5vL_uxbkbz7GSqs1d8w0fj85i2EZzVyKGEr4kkCglSTb037IiWjaiCHeoO4Ve/w640-h436/Screenshot%20(534).png" alt="" width="640" height="436" border="0" data-original-height="980" data-original-width="1436" /></a></h2>
<ul>
<li><strong>Buy One Bottle of <a href="https://lookerstudio.google.com/reporting/6feff679-4630-4501-9c02-fc9603100484">CinnaChroma</a> for $59/bottle + Small Shipping Fee.</strong></li>
<li><strong> Buy Three Bottles of CinnaChroma for $147 [USD 49/bottle] + Free Shipping + Digital Book as Bonus.</strong></li>
<li><strong><span style="background-color: yellow; color: maroon;"> Buy Six Bottles of <a href="https://colab.research.google.com/drive/1BRTbz246J-k6p5vmQOaUlh8rug0mzTln">CinnaChroma</a> for $234 [USD 39/bottle] + Free Shipping + Digital Book as Bonus.</span> <span style="color: lime;">✔✔</span><br /></strong></li>
</ul>
<h2 style="text-align: center;"><strong><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma">➦➦Just Click Here To Visit the Official Website & Buy CinnaChroma!</a></span></strong></h2>
<h3><strong>Free Bonus Is:</strong></h3>
<p>The Blood Sugar Solution Kit is Barton Publishing's #1 top-selling blood sugar support program. Created under the guidance of Dr. Scott Saunders, MD, this system helps balance blood sugar, support A1C, and combat the root causes of erratic blood sugar. With its easy-to-follow instructions, this Solution Kit has helped over half a million folks get off the regular “run of the mill” solutions for erratic blood sugar and enjoy stable blood sugar levels. Over half a million amazed users have made The Barton Blood Sugar Solution Kit the #1 most popular product in Barton Publishing's history! And you'll clearly see why, since this proven blood sugar support program is broken down into manageable, progressive phases that anyone with high blood sugar can use.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMzqbR_EWu3EXemXqPmaYdHOWojr_ex8p1zOflsn8FdHRijr4rHkylfJKX8KjmGt_VD1WzeyuCyqe6KTaYCt_BGkxKWN0DD5Wl_NE6V9qs5-iusjdYNcuWQ2OMQCtV74Hty6nCvMR7klYrlPOiDCLxZaHdhmtN5tgHt5HCFl2t3XcHQJLKVPk80TAN6r32/w640-h390/Screenshot%20(535).png" alt="" width="640" height="390" border="0" data-original-height="731" data-original-width="1200" /></a></span></h2>
<h2><strong>Money back guarantee on <a href="https://cinnachroma-offers.clubeo.com">CinnaChroma</a>.</strong></h2>
<p>Here's how it works: Go ahead and claim our best deal ever. Take full advantage of our exclusive 3- or 6-month supply bundle, where you can save up to $360 right now. And when you do, you'll have a full 365 days to try it out - risk-free. And if for any reason you don't see or feel the results you demand and deserve, or even if you just don't like the color of the bottle... whatever the reason, simply return it within 365 days for a full refund.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-cinnachroma"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgNxiY4KUzaE6B-LB5omteweDWwbuaaTTBQBHHHg6-e-rno_FTsqnvgZS0_cT3sfRaeP1KV6Ze3HvipuQHVdpDwqLlG-jwvKxeMiZJcLRIflJ7mciUjGupEcKGWMOtu8U0IkO_qgetNt5JaNZUgiZ4lGcwqlwMGbjZTi8jg5tlrdQ0pVRvk155ITk8ee4qP/w640-h456/CinnaChroma%2011.png" alt="" width="640" height="456" border="0" data-original-height="422" data-original-width="591" /></a></span></h2>
<h2><strong>Final words on CinnaChroma.</strong></h2>
<p>CinnaChroma is an excellent option for individuals dealing with Type 2 diabetes. It combines a potent blend of compounds that purify the body of harmful toxins and amplify insulin sensitivity. This results in an enhanced supply of oxygen in the bloodstream, leading to heightened vitality throughout the day. Consistent consumption of CinnaChroma prompts the body to use insulin more efficiently, aiding in the removal of glucose from the bloodstream. Additionally, it optimizes and elevates the efficiency of the body's metabolic processes. The result is an increase in daytime caloric expenditure, contributing to weight loss and an augmented level of energy.</p>
<h2 style="text-align: center;"><strong><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-cinnachroma">➦➦Visit the Official Website Today and Grab Your Bottle!</a></span></strong></h2>
<p><a href="https://cinnachromareviews.blogspot.com/2023/08/cinnachroma.html">https://cinnachromareviews.blogspot.com/2023/08/cinnachroma.html</a></p>
<p><a href="https://cinnachroma-reviews.blogspot.com/2023/08/cinnachroma-blood-sugar-support.html">https://cinnachroma-reviews.blogspot.com/2023/08/cinnachroma-blood-sugar-support.html</a></p>
<p><a href="https://sites.google.com/view/cinnachroma-bloodsugar-formula/home">https://sites.google.com/view/cinnachroma-bloodsugar-formula/home</a></p>
<p><a href="https://lookerstudio.google.com/reporting/b40e7013-7cea-4468-9b35-1de94d474e62">https://lookerstudio.google.com/reporting/b40e7013-7cea-4468-9b35-1de94d474e62</a></p>
<p><a href="https://colab.research.google.com/drive/1UtY4GO9PpWQIwfyOUINonJQ947yEbQXz">https://colab.research.google.com/drive/1UtY4GO9PpWQIwfyOUINonJQ947yEbQXz</a></p>
<p><a href="https://cinnachroma-updates.clubeo.com/calendar/2023/08/28/cinnachroma-no-1-in-us-forces-to-quickly-decrease-blood-sugar-levels-without-causing-harmful-effects">https://cinnachroma-updates.clubeo.com/calendar/2023/08/28/cinnachroma-no-1-in-us-forces-to-quickly-decrease-blood-sugar-levels-without-causing-harmful-effects</a></p>
<p><a href="https://cinnachroma-updates.clubeo.com">https://cinnachroma-updates.clubeo.com</a></p>
<p><a href="https://huggingface.co/datasets/cinnachromacapsule/cinnachroma/blob/main/README.md">https://huggingface.co/datasets/cinnachromacapsule/cinnachroma/blob/main/README.md</a></p>
<p><a href="https://cinnachromareviews.hashnode.dev/cinnachroma">https://cinnachromareviews.hashnode.dev/cinnachroma</a></p>
<p><a href="https://devfolio.co/@cinnachromaus">https://devfolio.co/@cinnachromaus</a></p>
<p><a href="https://pdfhost.io/v/esXX8lq.P_CinnaChroma_New_Updates_Powerfully_Helps_Reduce_A1C_and_Blood_Sugar_Spikes_in_People_With_Type_2">https://pdfhost.io/v/esXX8lq.P_CinnaChroma_New_Updates_Powerfully_Helps_Reduce_A1C_and_Blood_Sugar_Spikes_in_People_With_Type_2</a></p>
<p><a href="https://cinnachroma-reviews.company.site/">https://cinnachroma-reviews.company.site/</a></p>
<p><a href="https://cinnachroma-official.jimdosite.com/">https://cinnachroma-official.jimdosite.com/</a></p>
<p><a href="https://soundcloud.com/cinnachroma-550145605/advance-blood-sugar-supporter-supports-healthy-blood-glicose-metabolism-in-body">https://soundcloud.com/cinnachroma-550145605/advance-blood-sugar-supporter-supports-healthy-blood-glicose-metabolism-in-body</a></p>
<p><a href="https://www.ivoox.com/cinnachroma-new-updates-is-it-really-works-audios-mp3_rf_115028023_1.html">https://www.ivoox.com/cinnachroma-new-updates-is-it-really-works-audios-mp3_rf_115028023_1.html</a></p>
<p><a href="https://cinnachroma-official.bandcamp.com/track/cinnachroma-no-1-in-us-forces-to-quickly-decrease-blood-sugar-levels-without-causing-harmful-effects">https://cinnachroma-official.bandcamp.com/track/cinnachroma-no-1-in-us-forces-to-quickly-decrease-blood-sugar-levels-without-causing-harmful-effects</a></p>
<div id="simple-translate" class="simple-translate-system-theme"> </div> |
open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M | 2023-08-29T11:40:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of synapsoft/Llama-2-7b-hf-flan2022-1.2M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [synapsoft/Llama-2-7b-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T11:38:40.621041](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M/blob/main/results_2023-08-29T11%3A38%3A40.621041.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.42282294759517697,\n \"\
acc_stderr\": 0.034708188469939394,\n \"acc_norm\": 0.42623596296245025,\n\
\ \"acc_norm_stderr\": 0.03469491663340893,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.37973340895125,\n\
\ \"mc2_stderr\": 0.013708193792690383\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23122866894197952,\n \"acc_stderr\": 0.012320858834772254,\n\
\ \"acc_norm\": 0.23293515358361774,\n \"acc_norm_stderr\": 0.01235250704261741\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5849432383987253,\n\
\ \"acc_stderr\": 0.004917248150601852,\n \"acc_norm\": 0.7846046604262099,\n\
\ \"acc_norm_stderr\": 0.004102561587459201\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362445,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362445\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.023393826500484875,\n \"\
acc_norm\": 0.291005291005291,\n \"acc_norm_stderr\": 0.023393826500484875\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848876,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848876\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4806451612903226,\n\
\ \"acc_stderr\": 0.028422687404312107,\n \"acc_norm\": 0.4806451612903226,\n\
\ \"acc_norm_stderr\": 0.028422687404312107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3393939393939394,\n \"acc_stderr\": 0.036974422050315967,\n\
\ \"acc_norm\": 0.3393939393939394,\n \"acc_norm_stderr\": 0.036974422050315967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03540294377095368,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03540294377095368\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414358,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414358\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4307692307692308,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199966,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5853211009174312,\n \"acc_stderr\": 0.021122903208602592,\n \"\
acc_norm\": 0.5853211009174312,\n \"acc_norm_stderr\": 0.021122903208602592\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.02746740180405801,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02746740180405801\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5343137254901961,\n \"acc_stderr\": 0.03501038327635896,\n \"\
acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.03501038327635896\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5569620253164557,\n \"acc_stderr\": 0.032335327775334835,\n \
\ \"acc_norm\": 0.5569620253164557,\n \"acc_norm_stderr\": 0.032335327775334835\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6130268199233716,\n\
\ \"acc_stderr\": 0.017417138059440132,\n \"acc_norm\": 0.6130268199233716,\n\
\ \"acc_norm_stderr\": 0.017417138059440132\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4277456647398844,\n \"acc_stderr\": 0.026636539741116082,\n\
\ \"acc_norm\": 0.4277456647398844,\n \"acc_norm_stderr\": 0.026636539741116082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249612,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249612\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.02821768355665231,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.02821768355665231\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596157,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596157\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31029986962190353,\n\
\ \"acc_stderr\": 0.011815439293469832,\n \"acc_norm\": 0.31029986962190353,\n\
\ \"acc_norm_stderr\": 0.011815439293469832\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46568627450980393,\n \"acc_stderr\": 0.02018014484330729,\n \
\ \"acc_norm\": 0.46568627450980393,\n \"acc_norm_stderr\": 0.02018014484330729\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163907,\n\
\ \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163907\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.03531987930208731,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.03531987930208731\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476199,\n \"mc2\": 0.37973340895125,\n\
\ \"mc2_stderr\": 0.013708193792690383\n }\n}\n```"
repo_url: https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|arc:challenge|25_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hellaswag|10_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T11:38:40.621041.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T11:38:40.621041.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T11:38:40.621041.parquet'
- config_name: results
data_files:
- split: 2023_08_29T11_38_40.621041
path:
- results_2023-08-29T11:38:40.621041.parquet
- split: latest
path:
- results_2023-08-29T11:38:40.621041.parquet
---
# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-hf-flan2022-1.2M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [synapsoft/Llama-2-7b-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M",
"harness_truthfulqa_mc_0",
split="train")
```
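Since each run is stored under a timestamp-named split (with `_` in place of `-` and `:`), the newest run can also be selected programmatically instead of relying on the `latest` alias. A minimal sketch (the helper names are illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names encode the run timestamp with '_' replacing '-' and ':',
    # e.g. "2023_08_29T11_38_40.621041" -> 2023-08-29T11:38:40.621041
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

def latest_split(split_names) -> str:
    # Ignore the "latest" alias and pick the most recent timestamped run.
    runs = [s for s in split_names if s != "latest"]
    return max(runs, key=parse_split_timestamp)
```

This is handy when a configuration accumulates several runs and you want to compare a specific older split against the newest one.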
## Latest results
These are the [latest results from run 2023-08-29T11:38:40.621041](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M/blob/main/results_2023-08-29T11%3A38%3A40.621041.json):
```python
{
"all": {
"acc": 0.42282294759517697,
"acc_stderr": 0.034708188469939394,
"acc_norm": 0.42623596296245025,
"acc_norm_stderr": 0.03469491663340893,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.37973340895125,
"mc2_stderr": 0.013708193792690383
},
"harness|arc:challenge|25": {
"acc": 0.23122866894197952,
"acc_stderr": 0.012320858834772254,
"acc_norm": 0.23293515358361774,
"acc_norm_stderr": 0.01235250704261741
},
"harness|hellaswag|10": {
"acc": 0.5849432383987253,
"acc_stderr": 0.004917248150601852,
"acc_norm": 0.7846046604262099,
"acc_norm_stderr": 0.004102561587459201
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.036186648199362445,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.036186648199362445
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.023393826500484875,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.023393826500484875
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848876,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848876
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4806451612903226,
"acc_stderr": 0.028422687404312107,
"acc_norm": 0.4806451612903226,
"acc_norm_stderr": 0.028422687404312107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3393939393939394,
"acc_stderr": 0.036974422050315967,
"acc_norm": 0.3393939393939394,
"acc_norm_stderr": 0.036974422050315967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03540294377095368,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03540294377095368
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414358,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414358
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4307692307692308,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.4307692307692308,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199966,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5853211009174312,
"acc_stderr": 0.021122903208602592,
"acc_norm": 0.5853211009174312,
"acc_norm_stderr": 0.021122903208602592
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02746740180405801,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02746740180405801
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.03501038327635896,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.03501038327635896
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5569620253164557,
"acc_stderr": 0.032335327775334835,
"acc_norm": 0.5569620253164557,
"acc_norm_stderr": 0.032335327775334835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4732824427480916,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.4732824427480916,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.42718446601941745,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.42718446601941745,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6130268199233716,
"acc_stderr": 0.017417138059440132,
"acc_norm": 0.6130268199233716,
"acc_norm_stderr": 0.017417138059440132
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.026636539741116082,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.026636539741116082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249612,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.02821768355665231,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.02821768355665231
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596157,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596157
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31029986962190353,
"acc_stderr": 0.011815439293469832,
"acc_norm": 0.31029986962190353,
"acc_norm_stderr": 0.011815439293469832
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46568627450980393,
"acc_stderr": 0.02018014484330729,
"acc_norm": 0.46568627450980393,
"acc_norm_stderr": 0.02018014484330729
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5061224489795918,
"acc_stderr": 0.03200682020163907,
"acc_norm": 0.5061224489795918,
"acc_norm_stderr": 0.03200682020163907
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.03531987930208731,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.03531987930208731
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476199,
"mc2": 0.37973340895125,
"mc2_stderr": 0.013708193792690383
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
amazon-sagemaker/repository-metadata | 2023-10-09T19:09:25.000Z | [
"region:us"
] | amazon-sagemaker | null | null | null | 1 | 0 | Entry not found |
yzhuang/autotree_automl_MagicTelescope_gosdt_l512_d3_sd3 | 2023-08-29T11:44:33.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2606790213
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_MagicTelescope_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marasama/nva-lyla_lightsworn_sorceress | 2023-08-29T11:46:52.000Z | [
"region:us"
] | marasama | null | null | null | 0 | 0 | Entry not found |
SQWMEOOOW/zhxygyb | 2023-08-29T11:54:14.000Z | [
"region:us"
] | SQWMEOOOW | null | null | null | 0 | 0 | Entry not found |
Thaianh/Swinburne | 2023-08-29T12:00:21.000Z | [
"region:us"
] | Thaianh | null | null | null | 0 | 0 | Entry not found |
OpenAssistant/OASST-DE | 2023-09-28T08:26:17.000Z | [
"size_categories:1K<n<10K",
"language:de",
"license:apache-2.0",
"arxiv:2304.07327",
"region:us"
] | OpenAssistant | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: conversation
list:
- name: role
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8022604.792326268
num_examples: 3721
download_size: 4325950
dataset_size: 8022604.792326268
license: apache-2.0
language:
- de
size_categories:
- 1K<n<10K
---
# German OpenAssistant Conversations Dataset (OASST-DE)
With the goal of advancing open-source, German-language LLM research, we present
OASST-DE: a high-quality subset of a recent (25.08.23) dump from the [OpenAssistant website](https://www.open-assistant.io/),
translated into German using the GPT-3.5 API. More details on how the dataset was filtered and translated can be found under [Dataset Creation](#dataset-creation-process).
For more details on the OpenAssistant Project, look at the [first OASST dataset (OASST1)](https://huggingface.co/datasets/OpenAssistant/oasst1), [the Open-Assistant GitHub repo](https://github.com/LAION-AI/Open-Assistant)
or [our paper](https://arxiv.org/abs/2304.07327).
This dataset was created as part of LAION's LeoLM (Linguistically Enhanced Open Language Model) project led by Björn Plüster.
Check out LeoLM-Chat, finetuned on OASST-DE ([7b](https://huggingface.co/LeoLM/leo-hessianai-7b-chat), [13b](https://huggingface.co/LeoLM/leo-hessianai-13b-chat)), and read [the blog post](https://laion.ai/blog/leo-lm/) for more info on LeoLM.
## Dataset Creation Process
This dataset was created from a recent OASST dump by following these steps:
- Filter for top-1 response trees with assistant response leaves
- Filter for a first-prompt quality score >= 0.5
- Filter for a total conversation length < 1900 tokens, to fit into GPT-3.5's context length
- Filter for `'lang' == 'de'` -> add to the dataset directly
- Filter for `'lang' == 'en'` (other languages often result in failed translations)
- Translate the remaining English conversations using the GPT-3.5-turbo API (total cost ~$15).
This results in around 3.7k samples of high-quality assistant conversations.
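As an illustrative sketch, the filtering steps above could be expressed as follows. Note that the field names (`rank`, `prompt_quality`, `lang`, `messages`) and the character-based token count are assumptions for illustration, not the actual OASST export schema or processing code:

```python
def filter_conversations(trees, max_tokens=1900, count_tokens=len):
    """Sketch of the card's filter criteria; field names are hypothetical.

    Returns German conversations to keep as-is and English ones to translate.
    The real pipeline would count tokens with a tokenizer, not `len`.
    """
    german, needs_translation = [], []
    for tree in trees:
        # Top-1 response trees ending in an assistant reply only
        if tree["rank"] != 1 or tree["messages"][-1]["role"] != "assistant":
            continue
        # First prompt must have a quality score >= 0.5
        if tree["prompt_quality"] < 0.5:
            continue
        # Whole conversation must fit into the GPT-3.5 context window
        if sum(count_tokens(m["text"]) for m in tree["messages"]) >= max_tokens:
            continue
        if tree["lang"] == "de":
            german.append(tree)             # already German: add directly
        elif tree["lang"] == "en":
            needs_translation.append(tree)  # translate via GPT-3.5-turbo later
    return german, needs_translation
```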
## Dataset Structure
This dataset has a single `'conversation'` field. Each example is a list representing an alternating conversation between a `'prompter'` and an `'assistant'`,
where each entry is a dict with `'text'` and `'role'` fields:
```json
"conversation": [
{"role": "prompter", "text": "Moin, wie geht's dir?"},
{"role": "assistant", "text": "Moin Moin! Mir geht es gut, und dir?"},
...
]
```
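A conversation in this format can be flattened into a single training string, for example like this (the `<|role|>` tags below are a hypothetical template, not the official LeoLM chat format):

```python
def conversation_to_text(conversation):
    """Join alternating prompter/assistant turns into one training string."""
    return "\n".join(f"<|{turn['role']}|> {turn['text']}" for turn in conversation)

example = [
    {"role": "prompter", "text": "Moin, wie geht's dir?"},
    {"role": "assistant", "text": "Moin Moin! Mir geht es gut, und dir?"},
]
print(conversation_to_text(example))
```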
## Usage with 🤗Datasets:
```python
from datasets import load_dataset
ds = load_dataset("OpenAssistant/OASST-DE", split="train")
print(ds[0]["conversation"])
``` |
HydraLM/CoT-Collection-standardized | 2023-08-30T20:37:21.000Z | [
"region:us"
] | HydraLM | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 2149718484
num_examples: 3675842
download_size: 1206341432
dataset_size: 2149718484
---
# Dataset Card for "CoT-Collection-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malteee/SynTruckS | 2023-08-29T12:07:56.000Z | [
"region:us"
] | malteee | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__Llama-2-7B-GPTQ | 2023-08-30T09:34:20.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Llama-2-7B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-7B-GPTQ](https://huggingface.co/TheBloke/Llama-2-7B-GPTQ) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-7B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T09:33:50.119005](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-7B-GPTQ/blob/main/results_2023-08-30T09%3A33%3A50.119005.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44282708077943145,\n\
\ \"acc_stderr\": 0.03524402594377024,\n \"acc_norm\": 0.44692317330927667,\n\
\ \"acc_norm_stderr\": 0.03523098278385255,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520697,\n \"mc2\": 0.39318347243467555,\n\
\ \"mc2_stderr\": 0.013670242009997141\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4786689419795222,\n \"acc_stderr\": 0.014598087973127108,\n\
\ \"acc_norm\": 0.5204778156996587,\n \"acc_norm_stderr\": 0.014599131353035012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5760804620593507,\n\
\ \"acc_stderr\": 0.004931679059919375,\n \"acc_norm\": 0.7759410476000796,\n\
\ \"acc_norm_stderr\": 0.004161089244867776\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374768,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374768\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.022101128787415433,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.022101128787415433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45483870967741935,\n\
\ \"acc_stderr\": 0.02832774309156107,\n \"acc_norm\": 0.45483870967741935,\n\
\ \"acc_norm_stderr\": 0.02832774309156107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5393939393939394,\n \"acc_stderr\": 0.03892207016552013,\n\
\ \"acc_norm\": 0.5393939393939394,\n \"acc_norm_stderr\": 0.03892207016552013\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.035594435655639196,\n \"\
acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.035594435655639196\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37948717948717947,\n \"acc_stderr\": 0.02460362692409742,\n\
\ \"acc_norm\": 0.37948717948717947,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5834862385321101,\n \"acc_stderr\": 0.021136376504030874,\n \"\
acc_norm\": 0.5834862385321101,\n \"acc_norm_stderr\": 0.021136376504030874\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02835321286686342,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02835321286686342\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.49019607843137253,\n \"acc_stderr\": 0.03508637358630572,\n \"\
acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.03508637358630572\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5021097046413502,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.5021097046413502,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.044492703500683836,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.044492703500683836\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.039277056007874414,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.039277056007874414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.049486373240266376,\n\
\ \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.049486373240266376\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.030882736974138666,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.030882736974138666\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5810983397190294,\n\
\ \"acc_stderr\": 0.017643205052377188,\n \"acc_norm\": 0.5810983397190294,\n\
\ \"acc_norm_stderr\": 0.017643205052377188\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.02691864538323901,\n\
\ \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.02691864538323901\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.02858034106513829,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.02858034106513829\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5659163987138264,\n\
\ \"acc_stderr\": 0.0281502322445356,\n \"acc_norm\": 0.5659163987138264,\n\
\ \"acc_norm_stderr\": 0.0281502322445356\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32978723404255317,\n \"acc_stderr\": 0.028045946942042405,\n \
\ \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.028045946942042405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3578878748370274,\n\
\ \"acc_stderr\": 0.012243563850490309,\n \"acc_norm\": 0.3578878748370274,\n\
\ \"acc_norm_stderr\": 0.012243563850490309\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714874,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714874\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520697,\n \"mc2\": 0.39318347243467555,\n\
\ \"mc2_stderr\": 0.013670242009997141\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-7B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|arc:challenge|25_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|arc:challenge|25_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hellaswag|10_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hellaswag|10_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:13:30.420278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T09:33:50.119005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T09:33:50.119005.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T12:13:30.420278.parquet'
- split: 2023_08_30T09_33_50.119005
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T09:33:50.119005.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T09:33:50.119005.parquet'
- config_name: results
data_files:
- split: 2023_08_29T12_13_30.420278
path:
- results_2023-08-29T12:13:30.420278.parquet
- split: 2023_08_30T09_33_50.119005
path:
- results_2023-08-30T09:33:50.119005.parquet
- split: latest
path:
- results_2023-08-30T09:33:50.119005.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-7B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-7B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-7B-GPTQ](https://huggingface.co/TheBloke/Llama-2-7B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-7B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T09:33:50.119005](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-7B-GPTQ/blob/main/results_2023-08-30T09%3A33%3A50.119005.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44282708077943145,
"acc_stderr": 0.03524402594377024,
"acc_norm": 0.44692317330927667,
"acc_norm_stderr": 0.03523098278385255,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520697,
"mc2": 0.39318347243467555,
"mc2_stderr": 0.013670242009997141
},
"harness|arc:challenge|25": {
"acc": 0.4786689419795222,
"acc_stderr": 0.014598087973127108,
"acc_norm": 0.5204778156996587,
"acc_norm_stderr": 0.014599131353035012
},
"harness|hellaswag|10": {
"acc": 0.5760804620593507,
"acc_stderr": 0.004931679059919375,
"acc_norm": 0.7759410476000796,
"acc_norm_stderr": 0.004161089244867776
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374768,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374768
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.022101128787415433,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.022101128787415433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45483870967741935,
"acc_stderr": 0.02832774309156107,
"acc_norm": 0.45483870967741935,
"acc_norm_stderr": 0.02832774309156107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5393939393939394,
"acc_stderr": 0.03892207016552013,
"acc_norm": 0.5393939393939394,
"acc_norm_stderr": 0.03892207016552013
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.035594435655639196,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.035594435655639196
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37948717948717947,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.37948717948717947,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5834862385321101,
"acc_stderr": 0.021136376504030874,
"acc_norm": 0.5834862385321101,
"acc_norm_stderr": 0.021136376504030874
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02835321286686342,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02835321286686342
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5021097046413502,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.5021097046413502,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.044492703500683836,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.044492703500683836
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.039277056007874414,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.039277056007874414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.4854368932038835,
"acc_stderr": 0.049486373240266376,
"acc_norm": 0.4854368932038835,
"acc_norm_stderr": 0.049486373240266376
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030882736974138666,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030882736974138666
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5810983397190294,
"acc_stderr": 0.017643205052377188,
"acc_norm": 0.5810983397190294,
"acc_norm_stderr": 0.017643205052377188
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.02691864538323901,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.02691864538323901
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.02858034106513829,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.02858034106513829
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5659163987138264,
"acc_stderr": 0.0281502322445356,
"acc_norm": 0.5659163987138264,
"acc_norm_stderr": 0.0281502322445356
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.028045946942042405,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.028045946942042405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3578878748370274,
"acc_stderr": 0.012243563850490309,
"acc_norm": 0.3578878748370274,
"acc_norm_stderr": 0.012243563850490309
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714874,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714874
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520697,
"mc2": 0.39318347243467555,
"mc2_stderr": 0.013670242009997141
}
}
```
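The per-task metrics in the JSON above can also be aggregated by hand once loaded. The following is a minimal sketch, using a small inlined excerpt of the values above rather than the full results file, and assuming you only want the plain `acc` field:

```python
# A minimal sketch (not part of the leaderboard tooling): macro-averaging the
# per-task "acc" values from a results dict shaped like the JSON above.
# Only a small excerpt of the values is inlined here for illustration.
results = {
    "all": {"acc": 0.44282708077943145, "mc2": 0.39318347243467555},
    "harness|arc:challenge|25": {"acc": 0.4786689419795222},
    "harness|hellaswag|10": {"acc": 0.5760804620593507},
    "harness|truthfulqa:mc|0": {"mc1": 0.2484700122399021, "mc2": 0.39318347243467555},
}

# Skip the precomputed "all" entry and tasks that report mc1/mc2 instead of acc
# (TruthfulQA), then average the remaining per-task accuracies.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
```

The same pattern applies to `acc_norm`; TruthfulQA entries expose only `mc1`/`mc2`.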
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bigscience__bloomz | 2023-08-29T12:14:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bigscience/bloomz
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bigscience/bloomz](https://huggingface.co/bigscience/bloomz) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloomz\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T12:14:13.875692](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz/blob/main/results_2023-08-29T12%3A14%3A13.875692.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47917755403252543,\n\
\ \"acc_stderr\": 0.03572101484290109,\n \"acc_norm\": 0.48335485520551164,\n\
\ \"acc_norm_stderr\": 0.0357085480998606,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.4393940961026447,\n\
\ \"mc2_stderr\": 0.015292532701908591\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5042662116040956,\n \"acc_stderr\": 0.014610858923956955,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5553674566819359,\n\
\ \"acc_stderr\": 0.004959094146471527,\n \"acc_norm\": 0.7523401712806214,\n\
\ \"acc_norm_stderr\": 0.004307709682499536\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480863,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480863\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983052,\n \
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983052\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.532258064516129,\n \"acc_stderr\": 0.028384747788813332,\n \"\
acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.028384747788813332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.49696969696969695,\n \"acc_stderr\": 0.03904272341431857,\n\
\ \"acc_norm\": 0.49696969696969695,\n \"acc_norm_stderr\": 0.03904272341431857\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626303,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626303\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.616580310880829,\n \"acc_stderr\": 0.03508984236295342,\n\
\ \"acc_norm\": 0.616580310880829,\n \"acc_norm_stderr\": 0.03508984236295342\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.02531764972644865,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.02531764972644865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6623853211009174,\n \"acc_stderr\": 0.020275265986638924,\n \"\
acc_norm\": 0.6623853211009174,\n \"acc_norm_stderr\": 0.020275265986638924\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5196078431372549,\n \"acc_stderr\": 0.03506612560524866,\n \"\
acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.03506612560524866\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6232439335887612,\n\
\ \"acc_stderr\": 0.01732829290730305,\n \"acc_norm\": 0.6232439335887612,\n\
\ \"acc_norm_stderr\": 0.01732829290730305\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2770949720670391,\n\
\ \"acc_stderr\": 0.014968772435812145,\n \"acc_norm\": 0.2770949720670391,\n\
\ \"acc_norm_stderr\": 0.014968772435812145\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4919614147909968,\n\
\ \"acc_stderr\": 0.028394421370984545,\n \"acc_norm\": 0.4919614147909968,\n\
\ \"acc_norm_stderr\": 0.028394421370984545\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4691358024691358,\n \"acc_stderr\": 0.02776768960683393,\n\
\ \"acc_norm\": 0.4691358024691358,\n \"acc_norm_stderr\": 0.02776768960683393\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3116036505867014,\n\
\ \"acc_stderr\": 0.011829039182849648,\n \"acc_norm\": 0.3116036505867014,\n\
\ \"acc_norm_stderr\": 0.011829039182849648\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4950980392156863,\n \"acc_stderr\": 0.020226862710039473,\n \
\ \"acc_norm\": 0.4950980392156863,\n \"acc_norm_stderr\": 0.020226862710039473\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5671641791044776,\n\
\ \"acc_stderr\": 0.0350349092367328,\n \"acc_norm\": 0.5671641791044776,\n\
\ \"acc_norm_stderr\": 0.0350349092367328\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.03834234744164993,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.03834234744164993\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.4393940961026447,\n\
\ \"mc2_stderr\": 0.015292532701908591\n }\n}\n```"
repo_url: https://huggingface.co/bigscience/bloomz
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|arc:challenge|25_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hellaswag|10_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:14:13.875692.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T12:14:13.875692.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T12:14:13.875692.parquet'
- config_name: results
data_files:
- split: 2023_08_29T12_14_13.875692
path:
- results_2023-08-29T12:14:13.875692.parquet
- split: latest
path:
- results_2023-08-29T12:14:13.875692.parquet
---
# Dataset Card for Evaluation run of kashif/stack-llama-2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kashif/stack-llama-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kashif/stack-llama-2](https://huggingface.co/kashif/stack-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kashif__stack-llama-2",
"harness_truthfulqa_mc_0",
split="train")
```
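The timestamped split names used above follow a simple convention: they are the run timestamp with `-` and `:` replaced by `_` (the fractional-seconds `.` is kept). A minimal helper (not part of any library, shown here only as an illustration of the naming scheme) can derive the split name from a results timestamp:

```python
def run_split_name(timestamp: str) -> str:
    """Convert a run timestamp (e.g. from a results filename) into the
    corresponding dataset split name by replacing '-' and ':' with '_'."""
    return timestamp.replace("-", "_").replace(":", "_")

# The 2023-08-29 run of this dataset is stored under this split:
print(run_split_name("2023-08-29T12:14:13.875692"))
# 2023_08_29T12_14_13.875692
```

Passing such a split name instead of `"latest"` or `"train"` loads the results of that specific run.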
## Latest results
These are the [latest results from run 2023-08-29T12:14:13.875692](https://huggingface.co/datasets/open-llm-leaderboard/details_kashif__stack-llama-2/blob/main/results_2023-08-29T12%3A14%3A13.875692.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47917755403252543,
"acc_stderr": 0.03572101484290109,
"acc_norm": 0.48335485520551164,
"acc_norm_stderr": 0.0357085480998606,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.4393940961026447,
"mc2_stderr": 0.015292532701908591
},
"harness|arc:challenge|25": {
"acc": 0.5042662116040956,
"acc_stderr": 0.014610858923956955,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.5553674566819359,
"acc_stderr": 0.004959094146471527,
"acc_norm": 0.7523401712806214,
"acc_norm_stderr": 0.004307709682499536
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480863,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480863
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983052,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983052
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.041349130183033156,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.041349130183033156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.028384747788813332,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.028384747788813332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.49696969696969695,
"acc_stderr": 0.03904272341431857,
"acc_norm": 0.49696969696969695,
"acc_norm_stderr": 0.03904272341431857
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626303,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626303
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.616580310880829,
"acc_stderr": 0.03508984236295342,
"acc_norm": 0.616580310880829,
"acc_norm_stderr": 0.03508984236295342
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.02531764972644865,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.02531764972644865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6623853211009174,
"acc_stderr": 0.020275265986638924,
"acc_norm": 0.6623853211009174,
"acc_norm_stderr": 0.020275265986638924
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.03506612560524866,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.03506612560524866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6232439335887612,
"acc_stderr": 0.01732829290730305,
"acc_norm": 0.6232439335887612,
"acc_norm_stderr": 0.01732829290730305
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2770949720670391,
"acc_stderr": 0.014968772435812145,
"acc_norm": 0.2770949720670391,
"acc_norm_stderr": 0.014968772435812145
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984545,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4691358024691358,
"acc_stderr": 0.02776768960683393,
"acc_norm": 0.4691358024691358,
"acc_norm_stderr": 0.02776768960683393
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3116036505867014,
"acc_stderr": 0.011829039182849648,
"acc_norm": 0.3116036505867014,
"acc_norm_stderr": 0.011829039182849648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4950980392156863,
"acc_stderr": 0.020226862710039473,
"acc_norm": 0.4950980392156863,
"acc_norm_stderr": 0.020226862710039473
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5671641791044776,
"acc_stderr": 0.0350349092367328,
"acc_norm": 0.5671641791044776,
"acc_norm_stderr": 0.0350349092367328
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.03834234744164993,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.03834234744164993
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.4393940961026447,
"mc2_stderr": 0.015292532701908591
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
geekyrakshit/tanjiro-dreambooth-dataset | 2023-08-29T12:47:29.000Z | [
"license:unknown",
"region:us"
] | geekyrakshit | null | null | null | 0 | 0 | ---
license: unknown
---
|
open-llm-leaderboard/details_bigscience__bloom | 2023-08-29T12:20:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of None
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [None](https://huggingface.co/None) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloom\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T12:19:54.390376](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom/blob/main/results_2023-08-29T12%3A19%3A54.390376.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.315597821758106,\n\
\ \"acc_stderr\": 0.0334554445358342,\n \"acc_norm\": 0.31957868125391004,\n\
\ \"acc_norm_stderr\": 0.03344403068302842,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.3975962282334165,\n\
\ \"mc2_stderr\": 0.013579754303009808\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4658703071672355,\n \"acc_stderr\": 0.014577311315231102,\n\
\ \"acc_norm\": 0.5042662116040956,\n \"acc_norm_stderr\": 0.014610858923956948\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5676160127464649,\n\
\ \"acc_stderr\": 0.004943945069611452,\n \"acc_norm\": 0.7640908185620394,\n\
\ \"acc_norm_stderr\": 0.0042369801453443065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3169811320754717,\n \"acc_stderr\": 0.028637235639800925,\n\
\ \"acc_norm\": 0.3169811320754717,\n \"acc_norm_stderr\": 0.028637235639800925\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138623,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138623\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2967741935483871,\n \"acc_stderr\": 0.02598850079241188,\n \"\
acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.02598850079241188\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3787878787878788,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.3787878787878788,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29533678756476683,\n \"acc_stderr\": 0.0329229663915514,\n\
\ \"acc_norm\": 0.29533678756476683,\n \"acc_norm_stderr\": 0.0329229663915514\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.021107730127243998,\n\
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127243998\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804726,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804726\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.021004201260420078,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.021004201260420078\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.02792096314799366,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.02792096314799366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.30392156862745096,\n \"acc_stderr\": 0.032282103870378914,\n \"\
acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.032282103870378914\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3206751054852321,\n \"acc_stderr\": 0.03038193194999041,\n \
\ \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.03038193194999041\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.04412015806624502,\n \"acc_norm\"\
: 0.371900826446281,\n \"acc_norm_stderr\": 0.04412015806624502\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467763,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467763\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.032366121762202014,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.032366121762202014\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40102171136653897,\n\
\ \"acc_stderr\": 0.017526133150124572,\n \"acc_norm\": 0.40102171136653897,\n\
\ \"acc_norm_stderr\": 0.017526133150124572\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3670520231213873,\n \"acc_stderr\": 0.025950054337654096,\n\
\ \"acc_norm\": 0.3670520231213873,\n \"acc_norm_stderr\": 0.025950054337654096\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.35691318327974275,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.35691318327974275,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.026041766202717163,\n\
\ \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.026041766202717163\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27835723598435463,\n\
\ \"acc_stderr\": 0.011446990197380982,\n \"acc_norm\": 0.27835723598435463,\n\
\ \"acc_norm_stderr\": 0.011446990197380982\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.32189542483660133,\n \"acc_stderr\": 0.018901015322093085,\n \
\ \"acc_norm\": 0.32189542483660133,\n \"acc_norm_stderr\": 0.018901015322093085\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.31020408163265306,\n \"acc_stderr\": 0.02961345987248438,\n\
\ \"acc_norm\": 0.31020408163265306,\n \"acc_norm_stderr\": 0.02961345987248438\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.32338308457711445,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.32338308457711445,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.03591566797824663,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.03591566797824663\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.0381107966983353,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.0381107966983353\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.3975962282334165,\n\
\ \"mc2_stderr\": 0.013579754303009808\n }\n}\n```"
repo_url: https://huggingface.co/None
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|arc:challenge|25_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hellaswag|10_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:19:54.390376.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T12:19:54.390376.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T12:19:54.390376.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T12:19:54.390376.parquet'
- config_name: results
data_files:
- split: 2023_08_29T12_19_54.390376
path:
- results_2023-08-29T12:19:54.390376.parquet
- split: latest
path:
- results_2023-08-29T12:19:54.390376.parquet
---
# Dataset Card for Evaluation run of kashif/stack-llama-2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kashif/stack-llama-2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kashif/stack-llama-2](https://huggingface.co/kashif/stack-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kashif__stack-llama-2",
"harness_truthfulqa_mc_0",
split="train")
```
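As shown in the configs above, each per-run split is named after the run's timestamp, with the `-` and `:` characters replaced by `_` (e.g. run `2023-08-29T12:19:54.390376` becomes split `2023_08_29T12_19_54.390376`). A minimal sketch of that naming convention, should you need to select a specific run's split programmatically (the helper name is illustrative, not part of any library):

```python
def run_split_name(timestamp: str) -> str:
    """Derive a per-run split name from a run timestamp.

    Splits such as "2023_08_29T12_19_54.390376" in the configs above are the
    run timestamp "2023-08-29T12:19:54.390376" with "-" and ":" replaced by "_".
    """
    return timestamp.replace("-", "_").replace(":", "_")


print(run_split_name("2023-08-29T12:19:54.390376"))
# 2023_08_29T12_19_54.390376
```

You could then pass the resulting name as the `split` argument to `load_dataset` instead of `"train"` to pin a particular evaluation run.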
## Latest results
These are the [latest results from run 2023-08-29T12:19:54.390376](https://huggingface.co/datasets/open-llm-leaderboard/details_kashif__stack-llama-2/blob/main/results_2023-08-29T12%3A19%3A54.390376.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.315597821758106,
"acc_stderr": 0.0334554445358342,
"acc_norm": 0.31957868125391004,
"acc_norm_stderr": 0.03344403068302842,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.3975962282334165,
"mc2_stderr": 0.013579754303009808
},
"harness|arc:challenge|25": {
"acc": 0.4658703071672355,
"acc_stderr": 0.014577311315231102,
"acc_norm": 0.5042662116040956,
"acc_norm_stderr": 0.014610858923956948
},
"harness|hellaswag|10": {
"acc": 0.5676160127464649,
"acc_stderr": 0.004943945069611452,
"acc_norm": 0.7640908185620394,
"acc_norm_stderr": 0.0042369801453443065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3169811320754717,
"acc_stderr": 0.028637235639800925,
"acc_norm": 0.3169811320754717,
"acc_norm_stderr": 0.028637235639800925
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138623,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138623
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.02598850079241188,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.02598850079241188
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3787878787878788,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.3787878787878788,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29533678756476683,
"acc_stderr": 0.0329229663915514,
"acc_norm": 0.29533678756476683,
"acc_norm_stderr": 0.0329229663915514
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127243998,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127243998
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804726,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804726
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4,
"acc_stderr": 0.021004201260420078,
"acc_norm": 0.4,
"acc_norm_stderr": 0.021004201260420078
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.02792096314799366,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.02792096314799366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.032282103870378914,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.032282103870378914
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3206751054852321,
"acc_stderr": 0.03038193194999041,
"acc_norm": 0.3206751054852321,
"acc_norm_stderr": 0.03038193194999041
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.04412015806624502,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.04412015806624502
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467763,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467763
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.032366121762202014,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.032366121762202014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40102171136653897,
"acc_stderr": 0.017526133150124572,
"acc_norm": 0.40102171136653897,
"acc_norm_stderr": 0.017526133150124572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3670520231213873,
"acc_stderr": 0.025950054337654096,
"acc_norm": 0.3670520231213873,
"acc_norm_stderr": 0.025950054337654096
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.35691318327974275,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.35691318327974275,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.026041766202717163,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.026041766202717163
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27835723598435463,
"acc_stderr": 0.011446990197380982,
"acc_norm": 0.27835723598435463,
"acc_norm_stderr": 0.011446990197380982
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32189542483660133,
"acc_stderr": 0.018901015322093085,
"acc_norm": 0.32189542483660133,
"acc_norm_stderr": 0.018901015322093085
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.31020408163265306,
"acc_stderr": 0.02961345987248438,
"acc_norm": 0.31020408163265306,
"acc_norm_stderr": 0.02961345987248438
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.32338308457711445,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.32338308457711445,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.03591566797824663,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.03591566797824663
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.0381107966983353,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.0381107966983353
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.3975962282334165,
"mc2_stderr": 0.013579754303009808
}
}
```
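Every per-task entry in the results JSON above shares the same metric keys (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so aggregates such as a mean accuracy reduce to a short loop. The sketch below is illustrative, not official leaderboard tooling; the two tasks shown are excerpts of the full results dict.

```python
# A minimal sketch (not official leaderboard tooling): average the "acc"
# values of the MMLU ("hendrycksTest") tasks from a results dict shaped
# like the JSON above. The two entries are excerpts of the full results.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.4230769230769231},
    "harness|hendrycksTest-medical_genetics|5": {"acc": 0.4},
}

# Collect accuracies of the MMLU tasks only, identified by their prefix.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_acc, 4))  # mean accuracy over the excerpted tasks
```

The same pattern extends to `acc_norm` or to the full 57-subject MMLU set once all entries are loaded.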
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
neotonicsgummies/Neotonics-Gummies-Reviews | 2023-08-29T12:35:55.000Z | [
"region:us"
] | neotonicsgummies | null | null | null | 0 | 0 | <h2 style="text-align: center;"><a href="https://sale365day.com/get-neotonics-gummies"><span style="color: #003300;">Click Here -- Official Website -- Order Now</span></a></h2>
<h2 style="text-align: center;"><span style="color: red;">⚠️Beware Of Fake Websites⚠️</span></h2>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #ff00fe;">✔For Order Official Website -</span> <a href="https://sale365day.com/get-neotonics-gummies">https://sale365day.com/get-neotonics-gummies</a></strong></p>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #800180;">✔Product Name -</span> <span style="color: red;"><a href="https://medium.com/@neotonicsgummies/neotonics-gummies-reviews-scam-analyzed-and-exposed-by-medical-experts-real-user-responses-2023-ba6cf7904a5e">Neotonics Gummies</a></span></strong></p>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #2b00fe;">✔Side Effect -</span> <span style="color: #800180;">No Side Effects<br /></span></strong></p>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #274e13;">✔Availability - </span><a href="https://sale365day.com/get-neotonics-gummies">Online</a></strong></p>
<p><strong><span style="color: #274e13;">✔</span></strong><strong><span style="color: #274e13;">Rating -</span>⭐⭐⭐⭐⭐</strong></p>
<p><a href="https://sale365day.com/get-neotonics-gummies"><span style="font-size: large;"><strong><span style="color: #274e13;">Hurry Up - Limited Time Offer - Purchase Now</span></strong></span></a></p>
<p class="articleHeading mb-0"><strong><a href="https://www.forexagone.com/forum/experiences-trading/neotonics-gummies-reviews-formulated-with-100-pure-ingredients-that-helps-in-skin-and-gut-health-64727#161915">Neotonics Reviews</a> - What do consumers say about Neotonics Gummies? Consumers share their experiences with Neotonics, including its uses, benefits, and price. For more information, check the official website.</strong></p>
<p style="text-align: justify;">It takes constant effort to keep skin young and healthy, which is why many men and women look for topical solutions to reduce wrinkles and even out skin tone. Although these efforts may bring some improvement, only inner rejuvenation can improve the skin completely. <a href="https://infogram.com/neotonics-gummies-reviews-beware-scam-alert-is-it-works-or-fake-claim-1ho16vo938nyx4n">Neotonics</a> takes a different approach: a beauty supplement said to boost collagen formation.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-neotonics-gummies"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgnZWOVZay38F-Ki7O8-16o5bfe-o1GbRlMopBb2ufa28w_Oa8MoN91qdMdUOWtquKbE4vmvpi-Yveoz5jdHyvd5VFvCukIi0g0LVgsDqlQvdnVlSglw5gqr6ncIIi2RgTpVJmk5ZEHjN8ggOz4toBmXN5btueKQdW-vVIcUJhnB2Ji20HCwzTvvJEL/w640-h276/qdwdffrg.JPG" alt="" width="640" height="276" border="0" data-original-height="527" data-original-width="1224" /></a></div>
<p style="text-align: justify;"><a href="https://colab.research.google.com/drive/15NmvvbrD78f0uRj4fGrO51f11TTRBrsF">Neotonics</a> products use probiotic bacteria and all-natural substances that can help people regain balance and eliminate all toxins in the stomach. While responding to the rapid aging of the skin, this balance also addresses the gut microbiome, the main contributor to this rapid aging. The gut microbiota has a fundamental beneficial function in supporting the immune system and cellular health. The only way for a client to have a healthy immune system, youthful skin and a perfectly balanced digestive system is to figure out how to improve the environment when it is out of balance. </p>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">Limited Discount: Get Neotonics at 70% off on the official website!</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>Who created Neotonics gummies for skin & gut?</strong></h2>
<p style="text-align: justify;">Medical professionals with extensive expertise and training in skin health came up with the <a href="https://www.facebook.com/profile.php?id=61550985140645">Neotonics formula</a>. After discovering that stomach and skin health are closely connected, they spent years of research and testing perfecting every ingredient used in <a href="https://neotonics-gummies-reviews-2023.webflow.io/">Neotonics</a>.</p>
<p style="text-align: justify;"><a href="https://lookerstudio.google.com/reporting/5ce1a1c5-664e-46fd-aecc-1cfec6b95a8e/page/LG7aD">Neotonics</a> is developed in a GMP-certified facility and is based on the same idea. To ensure the formulation's effectiveness and safety, it has been repeatedly evaluated in third-party laboratories and in clinical settings. So Neotonics is a good choice if you are looking for a safe yet effective solution to support your skin health.</p>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">ORDER Neotonics at the LOWEST Price from its Official Website</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>How do Neotonics gummies for skin & gut work?</strong></h2>
<p style="text-align: justify;">Along with nine all-natural ingredients, <a href="https://neotonicsgummiesreviews.bandcamp.com/track/neotonics-gummies-reviews-breaking-news-shocking-customer-complaints-exposed-must-read">Neotonics</a> contains 500 million bacteria units. The gut microbiome and general health both benefit from this combination. It's simple to take vitamins to improve gut health, but not all supplements are created equal.</p>
<p style="text-align: justify;">According to our research, <a href="https://soundcloud.com/neotonics-gummies-reviews/neotonics-gummies-reviews2023-update-ingredients-side-effects-negative-complaints">Neotonics</a> is a remarkable product because it contains powerful and high-quality ingredients with a balanced composition. This provides Neotonics with a balanced combination of nutrients that support gut health. Your digestive system will benefit from probiotics, helping your skin age more gracefully.</p>
<p style="text-align: justify;">You should really see a change within a week to a month. You can heal yourself from the inside by taking care of your digestive system. <a href="https://neotonics-update.clubeo.com/page/neotonics-gummies-reviews-explained-2023-best-skin-gut-health-products-reviewed-neotonics-ingredients-cost.html">Neotonics</a> are completely safe, risk-free and reliable. Get supplements immediately if you have skin and stomach problems.</p>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">Bumper OFFER 2023 | Read Neotonics Reviews</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>What are the ingredients in Neotonics Gummies?</strong></h2>
<ul style="text-align: justify;">
<li><strong>Babchi</strong>: It is an herb that can lighten dark spots and improve the overall appearance of the skin. Skin becomes plumper and smoother due to increased collagen synthesis. In addition, this substance contributes to the renewal of the skin.<br /><br /></li>
<li><strong>Dandelion</strong>: Dandelion is used to aid digestion by stimulating the appetite. It has antioxidant qualities. The presence of inulin in the gastrointestinal tract reduces cholesterol absorption. In addition, it promotes satiety. These two Neotonics ingredients are beneficial probiotics that help protect the skin from free radicals.<br /><br /></li>
<li><strong>Bacillus coagulans</strong>: Taking Bacillus coagulans can help your body's beneficial bacteria grow. As a result, the stomach microbiome can change.<br /><br /></li>
<li><strong>Fenugreek</strong>: Fenugreek seeds contain a lot of antioxidants. In fact, it's a great moisturizer. It also has a positive effect on the digestive system and helps lower blood pressure.<br /><br /></li>
<li><strong>Citrus Nourishing Oil</strong>: Lemon oil has a number of skin-friendly properties, including the ability to make pores smaller and skin more supple. This Neotonics ingredient is also helpful in relieving pain caused by indigestion. In addition, it also helps to reduce stress and nervousness.<br /><br /></li>
<li><strong>Ceylon Ginger</strong>: Ceylon ginger can help increase the number of good bacteria in your body. In addition, it also helps to avoid skin damage. Organic Ceylon ginger restores skin while reducing cellulite and scarring.<br /><br /></li>
<li><strong>Slippery elm bark</strong>: One of the many benefits of slippery elm bark is that it protects the lining of the stomach. In addition, it also helps to reduce discomfort in the digestive tract.<br /><br /></li>
<li><strong>Lion's mane</strong>: Its primary goal is to improve overall gut health. It also helps fight depression by promoting the growth of new brain cells.</li>
</ul>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-neotonics-gummies"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirrxLADaQSCo2bTMKnF2qWfM71m3WxTVicnwa95SazqGg69Z1FIOjBM-0IqOiOBhWYcZrS6wzv2UioU5yg2YH5CMa-Zynsz6IU_ONMBfOkRAM15kchBJK5nD-YvPxSBTb6CjepABsaeEDiykF-PhkIDlTwGglLskwIgtXnsiaL50nhKqa8MYGP_vxf/w640-h448/qwdfwff.JPG" alt="" width="640" height="448" border="0" data-original-height="515" data-original-width="736" /></a></div>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">Exclusive Offer – Get Neotonics for an unbelievable low price today</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>What are the benefits in Neotonics gummies?</strong></h2>
<ul style="text-align: justify;">
<li><strong>Promotes healthy skin</strong>: <a href="https://neotonicsgummiesreviews.godaddysites.com/">Neotonics</a> are made with skin-boosting ingredients that can nourish your skin and boost radiance. These gummies promote skin cell regeneration, helping users look younger and slowing down the aging process.<br /><br /></li>
<li><strong>Reduce skin problems:</strong> Your skin problems will disappear if you use <a href="https://neotonicsgummiesreviews.contently.com/">Neotonics</a> daily within a few days. With the use of this nutritional supplement, you can renew your skin and eliminate fine lines, wrinkles and dark spots. Get fresh and healthy cells that help you fight aging and look younger.<br /><br /></li>
<li><strong>Eliminate intestinal problems:</strong> <a href="https://sketchfab.com/3d-models/neotonics-gummies-reviews-scam-2023-is-it-work-fb7fc99c1b94443583b410b050c715e7">Neotonics</a> contains 500 million units of probiotics designed to improve your gut microbiome. In just a few days, this nutritional supplement changes the gut microbiome and increases beneficial bacteria. It helps to grow beneficial bacteria and get rid of harmful bacteria that can cause chaos in your stomach. These gummies improve your body's ability to absorb nutrients, helping you fight aging skin and get a radiant look.<br /><br /></li>
<li><strong>Digestive system support:</strong> Your digestive system is soothed and cleansed with Neotonics. Your bowel movements become more regular, and inflammation and irritation decrease. <a href="https://www.protocols.io/view/neotonics-gummies-reviews-exposed-by-consumer-repo-czamx2c6">Neotonics</a> help improve your digestive health. Your body functions properly when food is digested properly. <br /><br /></li>
<li><strong>Promote collagen production:</strong> The <a href="https://www.townscript.com/e/neotonics-reviews-shocking-truth-alert-dont-buy-till-you-read-this-report-023244">Neotonics</a> formula is rich in important vitamins, minerals and amino acids that help stimulate the body's collagen production. Collagen is essential for improving skin suppleness, hydration, and general health. In addition, collagen can reduce wrinkles and skin roughness. Multiple Neotonics reviews have shown that this supplement has helped users retain moisture in their skin for a longer period of time. <br /><br /></li>
<li><strong>Calming effect:</strong> Serotonin present in ginger helps to reduce stress and anxiety. Some of the ingredients that help reduce stress and anxiety, and thus the risk of depression, are lion's mane, fennel, fenugreek and lemon balm.<br /><br /></li>
<li><strong>Blood sugar control:</strong> <a href="https://www.fuzia.com/fz/neotonics-gummies-reviews">Neotonics</a>, an anti-aging skin support supplement, contains ingredients that have been shown to have a positive effect on blood sugar levels in the body. Its organic herbs work by reducing hunger and the body's absorption of carbohydrates and sugars.<br /><br /></li>
<li><strong>General health:</strong> Many healthy vitamins and minerals are found in Neotonics gummies, helping you grow stronger overall.</li>
</ul>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">Click to buy Neotonics today from the company’s official website! </span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>PROS of Consuming Neotonics Gummies</strong></h2>
<ul style="text-align: justify;">
<li>It contains extracts of nine powerful plants and herbs.</li>
<li>It contains 500 million units of super powerful bacteria to support healthy digestion.</li>
<li>A 60-day money-back guarantee is included.</li>
<li>These gummies contain no soy, gluten or dairy.</li>
<li><a href="https://neotonics-latest.clubeo.com/calendar/2023/08/28/neotonics-reviews-alert-honest-buyer-beware-consumer-warning-2023-update">Neotonics</a> contain no artificial or chemical substances.</li>
<li>In addition, it includes two bonus eBooks for FREE.</li>
<li>Since it does not contain stimulants, it is not addictive.</li>
<li>Neotonics bottles are easy to carry, and the gummies are easy to swallow.</li>
</ul>
<h2 style="text-align: justify;"><strong>CONS of Consuming Neotonics Gummies</strong></h2>
<ul style="text-align: justify;">
<li>No other physical businesses or <u><a href="https://www.medicalifit.com/recommends/get-neotonics/" target="_blank" rel="nofollow noopener">websites</a></u> sell Neotonics.</li>
<li>The effects of Neotonics can vary from person to person.</li>
</ul>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">[BEST OFFER TODAY]: Click to order Neotonics Gummies </span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>Are Neotonics gummies safe to use, or do they have any side effects?</strong></h2>
<p style="text-align: justify;">According to the manufacturers, <a href="https://devfolio.co/@neotonicsgummie">Neotonics</a> is created for all women, regardless of age or health problems. The ingredients in this supplement have been clinically proven to be safe. In addition, the ingredients are systematically tested for their effectiveness and purity. This implies that the safety of Neotonics is guaranteed free of contaminants or toxins.</p>
<p style="text-align: justify;">Neotonics have been used by over 170,000 consumers, and no one has complained of any side effects from consuming these gummies. As a result, we can safely say that this is one of the purest gastrointestinal and skin supplements money can buy. </p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-neotonics-gummies"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdDzoms0IR1POxNLfwrgVy_B0ps1rp1kCk2APRgGv-4fpNeN9MOLYkg3W4ieFIZ44tr2xOoy3ZKDEtolhVi8zX9IkTLUkm8o0G6VNxD-vxKHoWxlnS8k1Ma1R1laCM5Vl_mT0oH6W4r9gP8g19Gaq_-bRXuzkNd0aG4nYgT5QMzAMaXa_mk-7vOnIV/w640-h310/efffeff.JPG" alt="" width="640" height="310" border="0" data-original-height="451" data-original-width="929" /></a></div>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">(Price Drop Alert) Click to Buy Neotonics For As Low As $39/ Bottle</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>Who can use Neotonics gummies for skin & gut?</strong></h2>
<p style="text-align: justify;">Anyone with digestive or skin problems can take <a href="https://neotonics-update.clubeo.com/page/neotonics-gummies-reviews-explained-2023-best-skin-gut-health-products-reviewed-neotonics-ingredients-cost.html">Neotonics</a>. This medicine supports your digestive system while improving your stomach flora. It increases the generation of healthy new cells that can make your skin look radiant and youthful.</p>
<p style="text-align: justify;">Men and women between the ages of 18 and 80 can use <a href="https://yourpillsboss.blogspot.com/2023/08/neotonics-gummies-reviews-1-skin-gut.html">Neotonics</a> gummies. Pregnant women, nursing mothers and anyone with pre-existing medical conditions should not take them. Thirty gummies are included with each bottle of Neotonics. You should consume one gummy per day. Keep your Neotonics consumption within the recommended limits unless advised otherwise by a doctor.</p>
<h2 style="text-align: justify;"><strong>What is the price for Neotonics Gummies?</strong></h2>
<ul style="text-align: justify;">
<li><strong>Get 30 Days Supplies of Neotonics Gummies</strong></li>
</ul>
<p style="text-align: justify;">This 30-day <a href="https://neotonics-update.clubeo.com/">Neotonics</a> set costs $69 a bottle. There are no shipping fees and simple one-time payments can be made with a variety of cards including MasterCard, Visa, Discover and others.</p>
<ul style="text-align: justify;">
<li><strong>Get 90 Days Supplies of Neotonics Gummies</strong></li>
</ul>
<p style="text-align: justify;">This Neotonics 90-day supply pack is considered the most popular combination. You can buy it for $177 or $59 per bottle. You get two more products for free along with free shipping on the package.</p>
<ul style="text-align: justify;">
<li><strong>Get 180 Days Supplies of Neotonics Gummies</strong></li>
</ul>
<p style="text-align: justify;">This <a href="https://neotonics-gummies-reviews.jimdosite.com/">Neotonics</a> combination, called the 180 Days’ Supply Pack, offers the greatest value. You'll pay $294 for it, or $49 per bottle. Similar to the last offer, free shipping and additional freebies are also included here.</p>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">(Buy directly) To purchase Neotonics from the official sales page</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>What bonuses are included with Neotonics Gummies?</strong></h2>
<ul style="text-align: justify;">
<li><strong>The Great Hair Reset: How to Grow Thick, Full and Lustrous Locks:</strong></li>
</ul>
<p style="text-align: justify;">Who doesn't want thick, shiny hair? This bonus eBook contains simple tips on how to improve the quality of your hair and look your best on any social occasion.</p>
<ul style="text-align: justify;">
<li><strong>Cellulite Be Gone: How to Banish Cellulite Naturally & Effectively at Home:</strong></li>
</ul>
<p style="text-align: justify;">All the recipes and their ingredients are included in this book. To look beautiful, you don't have to expose your skin to harmful chemicals; you can get rid of cellulite by following the organic recipes in this book.</p>
<h2 style="text-align: justify;"><strong>What is the refund policy for Neotonics gummies?</strong></h2>
<p style="text-align: justify;">Yes. While the creators of <a href="https://groups.google.com/g/neotonics-gummies-reviews/c/BK00mMI-D0E">Neotonics</a> are sure that their gummies will dramatically improve gut health and skin, customers who don't see any benefits can claim a refund. A 60-day 100% satisfaction guarantee is included with each bottle. This means that if you are not completely satisfied with the results, you can return the empty or full bottle of Neotonics gummies to the manufacturer and request a refund. </p>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">EXCLUSIVE DEAL: Buy Neotonics Gummies at the Lowest Cost Today</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>Neotonics Skin & Gut customer reviews:</strong></h2>
<ul style="text-align: justify;">
<li><strong>Sony ratings </strong>- I never imagined that my skin could be so beautiful. Two months ago, I wouldn't have believed it if you told me that a simple remedy could get rid of my dark spots and wrinkles. I'm glad I decided to try this.<br /><br /></li>
<li><strong>Christina said -</strong> I buy creams, serums and lotions for thousands of dollars. And they made no effort to help me in any way. It would have been more beneficial if I had been informed of this tactic earlier. I've also managed to drop three skirt sizes and get rid of my acne. I begged my friends to remove their makeup and give this a try.<br /><br /></li>
<li><strong>Wilson says -</strong> I've been taking <a href="https://groups.google.com/g/neotonics-gummies-reviews/c/uLuJ0KwAyZc">Neotonics</a> for almost six months now. I am very happy with my purchase and intend to continue using it.<br /><br /></li>
<li><strong>Jennifer said -</strong> “What a great company and great product! “With this blend of <a href="https://soundcloud.com/neotonics-gummies-reviews/neotonics-gummies-reviews2023-update-ingredients-side-effects-negative-complaints">Neotonics</a>, I have achieved exceptional results. Now I can go back to my diet because my belly and skin don't bother me anymore. This time I will definitely succeed because I have found the ideal team to accompany me.</li>
</ul>
<h2 style="text-align: justify;"><strong>Neotonics Skin & Gut Reviews – The Conclusion</strong></h2>
<p style="text-align: justify;"><a href="https://groups.google.com/g/neotonics-gummies-reviews">Neotonics</a> is a delicious gummy made entirely of natural ingredients that can help with gut and skin health. It has been professionally proven safe and effective, as it is made up of 500 million units of good bacteria and 9 powerful botanical ingredients.</p>
<p style="text-align: justify;">After thoroughly reviewing <a href="https://neotonicsgummiesreviews.bandcamp.com/track/neotonics-gummies-reviews-breaking-news-shocking-customer-complaints-exposed-must-read">Neotonics</a>, we can confidently say that it can help you delay the onset of aging while improving skin suppleness and hydration, so it is worth your money and patience. Before starting to consume these gummies, we advise our readers to speak with their doctor.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-neotonics-gummies"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4MxNR1Z0JuVWsfu7hUyN8NvO9CrlJ8vmpSQPqgFTFHzUiJby14KTNmiRr0_EeSpRW5rlkLDvbkCPOq8oNP9xpYs3dLtDC5Ou1TxTBkgXJy0CB2gvkL1YVTBRpMoOY8gbuul3o2VDaBw2vzG_hMyKPt1ncTYXXez5b82RA_a8dZvw9IdvqALomIIC8/w640-h592/wdfwfdfwef.JPG" alt="" width="640" height="592" border="0" data-original-height="556" data-original-width="601" /></a></div>
<p style="text-align: justify;"><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">Check The Availability Of Neotonics Gummies On The Official Website</span></a></u></strong></p>
<h2 style="text-align: justify;"><strong>Frequently Asked Questions – Neotonics Reviews</strong></h2>
<p style="text-align: justify;"><strong>Why are Neotonics beneficial?</strong></p>
<p style="text-align: justify;">Customers taking this supplement can address the main cause of aging and improve gut health. By speeding up cell turnover, users can ensure that their stomachs receive enough probiotic bacteria to keep their bodies functioning properly.</p>
<p style="text-align: justify;"><strong>Why do Neotonics work?</strong></p>
<p style="text-align: justify;">Thanks to natural substances and beneficial bacteria that fight harmful bacteria that accumulate in the stomach, <a href="https://neotonics-gummies-reviews.hashnode.dev/neotonics-gummies-reviews-2023-is-it-legit-or-shocking-customer-controversy-dosage-ingredients-side-effects-exposed">Neotonics</a> promote intestinal homeostasis. To improve health outcomes, <a href="https://www.provenexpert.com/neotonics3/">Neotonics</a> also balance the user's immune system.</p>
<p style="text-align: justify;"><strong>What ingredients make up Neotonics?</strong></p>
<p style="text-align: justify;">Babchi, Inulin, Dandelion Root, Bacillus Coagulans, Fenugreek, Lemon Balm, Organic Ceylon Ginger, Slippery Elm Bark, Organic Lion's Mane and Fennel are all included in each capsule for customer support. These compounds have all been studied for the benefits they can provide for overall health.</p>
<p style="text-align: justify;"><strong>Do Neotonics cause side effects?</strong></p>
<p style="text-align: justify;">No. Scientific research supports the effectiveness of the all-natural ingredients used in this product. Although people with already healthy digestive systems will not see much of an effect, the manufacturers are careful to use only a safe amount of each ingredient.</p>
<p style="text-align: justify;"><strong>How should Neotonics be taken?</strong></p>
<p style="text-align: justify;">Users who want to improve their digestion and skin should consume only one gummy per day. No extra food or liquids are needed to get the support you need.</p>
<p><strong><u><a href="https://sale365day.com/get-neotonics-gummies" target="_blank" rel="nofollow noopener"><span style="font-size: large;">Exclusive Offer – Get Neotonics for an unbelievable low price today</span></a></u></strong></p>
<p><strong>Read More:</strong></p>
<p><a href="https://yourpillsboss.blogspot.com/2023/08/neotonics-gummies-reviews-1-skin-gut.html">https://yourpillsboss.blogspot.com/2023/08/neotonics-gummies-reviews-1-skin-gut.html</a><br /><a href="https://neotonics-gummies-reviews.jimdosite.com/">https://neotonics-gummies-reviews.jimdosite.com/</a><br /><a href="https://groups.google.com/g/neotonics-gummies-reviews">https://groups.google.com/g/neotonics-gummies-reviews</a><br /><a href="https://groups.google.com/g/neotonics-gummies-reviews/c/BK00mMI-D0E">https://groups.google.com/g/neotonics-gummies-reviews/c/BK00mMI-D0E</a><br /><a href="https://groups.google.com/g/neotonics-gummies-reviews/c/uLuJ0KwAyZc">https://groups.google.com/g/neotonics-gummies-reviews/c/uLuJ0KwAyZc</a><br /><a href="https://colab.research.google.com/drive/15NmvvbrD78f0uRj4fGrO51f11TTRBrsF">https://colab.research.google.com/drive/15NmvvbrD78f0uRj4fGrO51f11TTRBrsF</a><br /><a href="https://neotonics-gummies-reviews-2023.webflow.io/">https://neotonics-gummies-reviews-2023.webflow.io/</a><br /><a href="https://devfolio.co/@neotonicsgummie">https://devfolio.co/@neotonicsgummie</a><br /><a href="https://neotonics-update.clubeo.com/page/neotonics-gummies-reviews-explained-2023-best-skin-gut-health-products-reviewed-neotonics-ingredients-cost.html">https://neotonics-update.clubeo.com/page/neotonics-gummies-reviews-explained-2023-best-skin-gut-health-products-reviewed-neotonics-ingredients-cost.html</a><br /><a href="https://neotonics-update.clubeo.com/">https://neotonics-update.clubeo.com/</a><br /><a href="https://soundcloud.com/neotonics-gummies-reviews/neotonics-gummies-reviews2023-update-ingredients-side-effects-negative-complaints">https://soundcloud.com/neotonics-gummies-reviews/neotonics-gummies-reviews2023-update-ingredients-side-effects-negative-complaints</a><br /><a 
href="https://neotonicsgummiesreviews.bandcamp.com/track/neotonics-gummies-reviews-breaking-news-shocking-customer-complaints-exposed-must-read">https://neotonicsgummiesreviews.bandcamp.com/track/neotonics-gummies-reviews-breaking-news-shocking-customer-complaints-exposed-must-read</a><br /><a href="https://lookerstudio.google.com/reporting/5ce1a1c5-664e-46fd-aecc-1cfec6b95a8e/page/LG7aD">https://lookerstudio.google.com/reporting/5ce1a1c5-664e-46fd-aecc-1cfec6b95a8e/page/LG7aD</a><br /><a href="https://www.facebook.com/profile.php?id=61550985140645">https://www.facebook.com/profile.php?id=61550985140645</a><br /><a href="https://infogram.com/neotonics-gummies-reviews-beware-scam-alert-is-it-works-or-fake-claim-1ho16vo938nyx4n">https://infogram.com/neotonics-gummies-reviews-beware-scam-alert-is-it-works-or-fake-claim-1ho16vo938nyx4n</a><br /><a href="https://medium.com/@neotonicsgummies/neotonics-gummies-reviews-scam-analyzed-and-exposed-by-medical-experts-real-user-responses-2023-ba6cf7904a5e">https://medium.com/@neotonicsgummies/neotonics-gummies-reviews-scam-analyzed-and-exposed-by-medical-experts-real-user-responses-2023-ba6cf7904a5e</a><br /><a href="https://medium.com/@neotonicsgummies">https://medium.com/@neotonicsgummies</a><br /><a href="https://www.forexagone.com/forum/experiences-trading/neotonics-gummies-reviews-formulated-with-100-pure-ingredients-that-helps-in-skin-and-gut-health-64727#161915">https://www.forexagone.com/forum/experiences-trading/neotonics-gummies-reviews-formulated-with-100-pure-ingredients-that-helps-in-skin-and-gut-health-64727#161915</a><br /><a href="https://www.townscript.com/e/neotonics-reviews-shocking-truth-alert-dont-buy-till-you-read-this-report-023244">https://www.townscript.com/e/neotonics-reviews-shocking-truth-alert-dont-buy-till-you-read-this-report-023244</a><br /><a href="https://www.fuzia.com/fz/neotonics-gummies-reviews">https://www.fuzia.com/fz/neotonics-gummies-reviews</a><br /><a 
href="https://www.protocols.io/blind/1A72492A465511EEB7930A58A9FEAC02">https://www.protocols.io/blind/1A72492A465511EEB7930A58A9FEAC02</a><br /><a href="https://sketchfab.com/3d-models/neotonics-gummies-reviews-scam-2023-is-it-work-fb7fc99c1b94443583b410b050c715e7">https://sketchfab.com/3d-models/neotonics-gummies-reviews-scam-2023-is-it-work-fb7fc99c1b94443583b410b050c715e7</a><br /><a href="https://neotonicsgummiesreviews.contently.com/">https://neotonicsgummiesreviews.contently.com/</a><br /><a href="https://neotonicsgummiesreviews.godaddysites.com/">https://neotonicsgummiesreviews.godaddysites.com/</a><br /><a href="https://www.provenexpert.com/neotonics3/">https://www.provenexpert.com/neotonics3/</a><br /><a href="https://neotonics-gummies-reviews.hashnode.dev/neotonics-gummies-reviews-2023-is-it-legit-or-shocking-customer-controversy-dosage-ingredients-side-effects-exposed">https://neotonics-gummies-reviews.hashnode.dev/neotonics-gummies-reviews-2023-is-it-legit-or-shocking-customer-controversy-dosage-ingredients-side-effects-exposed</a><br /><a href="https://hashnode.com/@neotonicsgummiesrevi">https://hashnode.com/@neotonicsgummiesrevi</a><br /><a href="https://neotonics-latest.clubeo.com/calendar/2023/08/28/neotonics-reviews-alert-honest-buyer-beware-consumer-warning-2023-update">https://neotonics-latest.clubeo.com/calendar/2023/08/28/neotonics-reviews-alert-honest-buyer-beware-consumer-warning-2023-update</a><br /><a href="https://devfolio.co/@neotonicsreview">https://devfolio.co/@neotonicsreview</a><br /><a href="https://neotonicsgummiesreview.contently.com/">https://neotonicsgummiesreview.contently.com/</a><br /><a href="https://neotonics-gummies-reviews-usa-lab-teste.webflow.io/">https://neotonics-gummies-reviews-usa-lab-teste.webflow.io/</a><br /><a 
href="https://sketchfab.com/3d-models/neotonics-gummies-reviews-caution-pros-cons-5a17b7c491884cf8a856a6e21c856194">https://sketchfab.com/3d-models/neotonics-gummies-reviews-caution-pros-cons-5a17b7c491884cf8a856a6e21c856194</a><br /><a href="https://neotonicsgummiesreviewsscamale.godaddysites.com/">https://neotonicsgummiesreviewsscamale.godaddysites.com/</a><br /><a href="https://www.protocols.io/blind/0958FDA2465511EEB7930A58A9FEAC02">https://www.protocols.io/blind/0958FDA2465511EEB7930A58A9FEAC02</a><br /><a href="https://neotonicsgummiesreviewsusa.bandcamp.com/track/neotonics-gummies-reviews-shocking-customer-complaints-exposed-2023-negative-side-effects-for-customer-risk">https://neotonicsgummiesreviewsusa.bandcamp.com/track/neotonics-gummies-reviews-shocking-customer-complaints-exposed-2023-negative-side-effects-for-customer-risk</a><br /><a href="https://www.forexagone.com/forum/analyse-graphique/neotonics-gummies-reviews-scam-exposed-nobody-tells-you-the-100-truth-about-neotonics-64809#161997">https://www.forexagone.com/forum/analyse-graphique/neotonics-gummies-reviews-scam-exposed-nobody-tells-you-the-100-truth-about-neotonics-64809#161997</a><br /><a href="https://www.townscript.com/e/neotonics-reviews-scam-alert-2023-do-this-buying-or-fake-complaints-314040">https://www.townscript.com/e/neotonics-reviews-scam-alert-2023-do-this-buying-or-fake-complaints-314040</a><br /><a href="https://neotonicsgummiesreviewsscam.bandcamp.com/track/neotonics-gummies-reviews-shocking-responses-2023-negative-side-effects-risk-for-consumers">https://neotonicsgummiesreviewsscam.bandcamp.com/track/neotonics-gummies-reviews-shocking-responses-2023-negative-side-effects-risk-for-consumers</a><br /><a href="https://sketchfab.com/3d-models/neotonics-gummies-reviews-2023-scam-alert-works-301d2140591f4c0f9cb3126ac11e9fe4">https://sketchfab.com/3d-models/neotonics-gummies-reviews-2023-scam-alert-works-301d2140591f4c0f9cb3126ac11e9fe4</a><br /><a 
href="https://www.forexagone.com/forum/matieres-premieres/neotonics-gummies-reviews-fraudulent-exposed-2023-scam-exposed-does-it-work-64837#162025">https://www.forexagone.com/forum/matieres-premieres/neotonics-gummies-reviews-fraudulent-exposed-2023-scam-exposed-does-it-work-64837#162025</a></p> |
zxvix/pubmed_physics | 2023-08-29T12:36:20.000Z | [
"region:us"
] | zxvix | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: MedlineCitation
struct:
- name: PMID
dtype: int32
- name: DateCompleted
struct:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: NumberOfReferences
dtype: int32
- name: DateRevised
struct:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: Article
struct:
- name: Abstract
struct:
- name: AbstractText
dtype: string
- name: ArticleTitle
dtype: string
- name: AuthorList
struct:
- name: Author
sequence:
- name: LastName
dtype: string
- name: ForeName
dtype: string
- name: Initials
dtype: string
- name: CollectiveName
dtype: string
- name: Language
dtype: string
- name: GrantList
struct:
- name: Grant
sequence:
- name: GrantID
dtype: string
- name: Agency
dtype: string
- name: Country
dtype: string
- name: PublicationTypeList
struct:
- name: PublicationType
sequence: string
- name: MedlineJournalInfo
struct:
- name: Country
dtype: string
- name: ChemicalList
struct:
- name: Chemical
sequence:
- name: RegistryNumber
dtype: string
- name: NameOfSubstance
dtype: string
- name: CitationSubset
dtype: string
- name: MeshHeadingList
struct:
- name: MeshHeading
sequence:
- name: DescriptorName
dtype: string
- name: QualifierName
dtype: string
- name: PubmedData
struct:
- name: ArticleIdList
sequence:
- name: ArticleId
sequence: string
- name: PublicationStatus
dtype: string
- name: History
struct:
- name: PubMedPubDate
sequence:
- name: Year
dtype: int32
- name: Month
dtype: int32
- name: Day
dtype: int32
- name: ReferenceList
sequence:
- name: Citation
dtype: string
- name: CitationId
dtype: int32
- name: text
dtype: string
- name: title
dtype: string
- name: original_text
dtype: string
splits:
- name: test
num_bytes: 4045589.405
num_examples: 995
download_size: 2215468
dataset_size: 4045589.405
---
# Dataset Card for "pubmed_physics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
recursix/geo-bench-test | 2023-08-29T12:45:03.000Z | [
"region:us"
] | recursix | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_bank-marketing_sgosdt_l256_d3_sd0 | 2023-08-30T13:22:09.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 174960000
num_examples: 10000
- name: validation
num_bytes: 174960000
num_examples: 10000
download_size: 72788389
dataset_size: 349920000
---
# Dataset Card for "autotree_automl_bank-marketing_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JamalSQ/JamalSQLab | 2023-08-30T09:12:39.000Z | [
"task_categories:text-generation",
"task_categories:token-classification",
"task_categories:text2text-generation",
"task_categories:question-answering",
"size_categories:100K<n<1M",
"language:aa",
"language:sr",
"language:en",
"license:osl-3.0",
"code",
"chemistry",
"legal",
"not-for-all-aud... | JamalSQ | null | null | null | 0 | 0 | ---
license: osl-3.0
task_categories:
- text-generation
- token-classification
- text2text-generation
- question-answering
language:
- aa
- sr
- en
tags:
- code
- chemistry
- legal
- not-for-all-audiences
- finance
pretty_name: Sherminator
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AnimaleMaleEnhancementZA/AnimaleMaleEnhancementSouthAfrica | 2023-08-29T13:02:39.000Z | [
"region:us"
] | AnimaleMaleEnhancementZA | null | null | null | 0 | 0 | <h3><span style="background-color: #ffff00;"><strong>Our Official Facebook Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>https://www.facebook.com/AnimaleCBDGummiesZA/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/</strong></a></p>
<p><a href="https://www.facebook.com/events/1121615602562904/"><strong>https://www.facebook.com/events/1121615602562904/</strong></a></p>
<p><a href="https://www.facebook.com/events/1295846104688434/"><strong>https://www.facebook.com/events/1295846104688434/</strong></a></p>
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p> </p>
<h3><span style="font-weight: 400;">➥ Product Name — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Animale Male Enhancement</strong></a></span><span style="font-weight: 400;"> </span></h3>
<h3><span style="font-weight: 400;">➥ Country — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>South Africa</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Main Benefits — </span><span style="color: #800000;"><a style="color: #800000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>Male Enhancement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Rating —</span> <span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>5.0/5.0</strong></a></span><span style="font-weight: 400;"> ⭐⭐⭐⭐⭐</span></h3>
<h3><span style="font-weight: 400;">➥ Results — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>In 1-3 Months</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Availability — </span><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>Online</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Side Effects — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>No Major Side Effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Official Website (Sale Is Live) — </span><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>Click Here To Order Animale Male Enhancement South Africa</strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">There are many ways to access the system for sexual health care. </span><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Sexual health</strong></a><span style="font-weight: 400;"> is best during your 20s, and your sexual peak is between the ages of 20 and 50. As we get older, our performance declines. However, the ability to perform at the highest level possible stays the same. As we get older, our sexual desire decreases, and further sexual health deteriorates, which makes us less attractive in bed.</span></p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<p><span style="background-color: #ccffff;"><strong>There are many things that men in their fifties and sixties may experience, such as decreased sexual function, but having erections can often be hard.</strong></span></p>
<p> </p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>Animale Male Enhancement South Africa</strong></a><span style="font-weight: 400;"> To enjoy the maximum of your life, it’s essential that you get the right nutrients in your body. Nutrition is the key to achieving maximum health and vitality.</span></p>
<p> </p>
<p><span style="font-weight: 400;">Poor quality of life is another root cause of this problem. People with insomnia have a difficult time coming up with sexual efforts and often feel stuck in their day-to-day life too. </span></p>
<p> </p>
<p><span style="font-weight: 400;">When it comes to sexual life, many men usually turn to over-the-counter medications, but it is important to know that such medicines have some or another side effect or may have only short-term effects on the body. If they have a short-term effect, it may become possible to take the medicine again and again and further take your body towards more exposure to side effects. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><img src="https://i.ibb.co/S5fL8Gp/Animale-Male-Enhancement-Buy.jpg" alt="Animale-Male-Enhancement-Buy" border="0" /></a></p>
<p> </p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<p><span style="font-weight: 400;">This is why men want to look for a natural way that can permanently solve their sexual decline issues. Here come male enhancement products. You may get numerous supplements online. But it is important to find the right one. </span></p>
<p> </p>
<p><span style="font-weight: 400;">According to our research, </span><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Animale Male Enhancement South Africa</strong></a><span style="font-weight: 400;"> is the perfect male enhancement formula that improves your sexual life as well as supports your overall health. </span></p>
<p> </p>
<p><span style="font-weight: 400;">But does it really work? If yes, how and what does it contain? To know more details about the supplement, read this comprehensive review. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>What is Animale Male Enhancement South Africa?</strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>Animale Male Enhancement South Africa</strong></a><span style="font-weight: 400;"> are the newest, most effective, and healthiest male enhancement gummies on the market today. They’re made with a combination of superfoods such as acai berries, maca root and more to help you regain your sexual life and improve your overall health. These gummies are also gluten-free and non-GMO!</span></p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<p><span style="font-weight: 400;">As we all know, not only women but men are also responsible for conception. So it is important for men to have adequate levels of testosterone, it is a male sexual hormone, to have healthy fertility and sperm count. And the best part is </span><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>Animale Male Enhancement South Africa</strong></a><span style="font-weight: 400;"> is one that can help you achieve your sexual goals without any side effects. Whether you lack testosterone or energy level, everything is taken care of by these gummies. </span></p>
<p> </p>
<p><span style="font-weight: 400;">Additionally, this </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>male enhancement formula</strong></a><span style="font-weight: 400;"> is completely safe, there are no harmful chemicals are added. Along with improving hormone levels, the supplement ensures to keep your reproductive parts are healthy by regulating blood circulation. It allows the blood to flow freely through all parts of the reproductive system as well as the body.With its unique formulation, this supplement provides better erections and allow men to have long lasting sexual performance on bed.</span></p>
<p> </p>
<p><span style="font-weight: 400;">Not only this, the supplement is effective to support overall health by reducing inflammation and promoting better blood flow. According to the manufacturer of </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>Animale Male Enhancement South Africa</strong></a><span style="font-weight: 400;">, taking these gummies everyday will help you in regaining your sexual desire just like the way you had in your young time. </span></p>
<p> </p>
<p><span style="font-weight: 400;">Also, if you have been diagnosed with erectile dysfunction or poor ejaculation, then it is a time to look for this supplement and adding this in your daily routine is a best thing you could do to have better sexual life. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><img src="https://i.ibb.co/0XJxPwq/Animale-Male-Enhancement-Order.jpg" alt="Animale-Male-Enhancement-Order" border="0" /></a></p>
<p> </p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>How does Animale Male Enhancement South Africa Work? </strong></a></span></h3>
<h3> </h3>
<p><span style="font-weight: 400;">When we talk about how this </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>male enhancement formula</strong></a><span style="font-weight: 400;"> work in the body, the very first thing you must know is it improves the working of reproductive system and contains all natural ingredients which are proven to keep the system healthy. It also includes herbs that has been used by many people for centuries to improve erectile function. </span></p>
<p> </p>
<p><span style="font-weight: 400;">Most ingredients present in the formula are rich in antioxidants which works to lower the stress level and fight against free radical damage. It is a fact that when you have a healthy body, you naturally have better blood flow all over the body. So this is where the </span><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Animale Male Enhancement South Africa</strong></a><span style="font-weight: 400;"> work, they keep the reproductive system healthy and improve the blood flow. </span></p>
<p> </p>
<p><span style="font-weight: 400;">When there is better blood circulation, it provides maximum oxygen to the penis and that results in better erections and increased sperm count. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>What are the ingredients? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">This product contains natural ingredients such as Saw Palmetto, </span><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>L-Arginine</strong></a><span style="font-weight: 400;">, Yohimbe Extract, Bioperin and so on. It helps to improve blood circulation and boosts sexual stamina. How does it work? It helps in improving blood circulation and enhances the sexual stamina. The product also helps in boosting the energy level of the body. It also increases the production of testosterone in the body.</span></p>
<p> </p>
<p><span style="font-weight: 400;">The ingredients in this supplement are clinically proven to increase stamina, endurance, energy levels, and overall sexual performance. The main ingredient is L-arginine which helps improve the flow of blood in the penis. This supplement has been clinically proven to help increase the volume of semen and the length of erections. It also contains other ingredients such as L-citrulline and </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>L-carnitine</strong></a><span style="font-weight: 400;">. </span></p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<p><span style="font-weight: 400;">In order to get a successful erection, a man must be aroused. The nerves that control erections are part of the nervous system. This is where male sexual arousal starts. The brain releases hormones and chemical messengers into the body. These messengers stimulate the nerves in the penis and cause the penis to become engorged with blood. In the first place, you can't just "make" yourself have an erection. An erection is something that occurs naturally inside your body. </span></p>
<p> </p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>For example, A car doesn't get an erection by itself.</strong></a><span style="font-weight: 400;"> It requires outside stimulation before it will work. You need to have some kind of external source of arousal to make the car function. If you want to learn how to have an erection, you'll need to find ways to increase your level of arousal. What makes a man sexually aroused? What makes a man sexually aroused? There are two types of male sexual arousal: involuntary and voluntary. This means that if you are sexually aroused, there are two different things going on. One is a physical response inside your body.</span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><img src="https://i.ibb.co/GVtLk96/Animale-Male-Enhancement-Shop.jpg" alt="Animale-Male-Enhancement-Shop" border="0" /></a></p>
<p> </p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>Ingredients of Animale Male Enhancement South Africa </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Every man is concerned about ED at some point in his life, regardless of his age. Their self-esteem and sexual relations with their partners are adversely affected as a result. Thirty ingredients are included in this formula to aid in maintaining a healthy sexual system.</span></p>
<p> </p>
<p><span style="color: #339966;"><a style="color: #339966;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>Quercetin </strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">Quercetin is used to lower blood pressure. Users have reported an improvement in their physical performance. There is a cascading effect when blood pressure falls. This results in increased blood flow to the penis.</span></p>
<p> </p>
<p><span style="color: #339966;"><a style="color: #339966;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>Glutamate</strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">There is a positive impact on neurotransmitters with glutamate. Sexual performance and libido are boosted by these neurotransmitters. Users can benefit from a functioning central nervous system with the help of these neurotransmitters.</span></p>
<p> </p>
<p><span style="color: #339966;"><a style="color: #339966;" href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Saw Palmetto</strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">If you're going to lose some weight, Saw Palmetto is the best bet. This supplement helps in the production of testosterone in men. These ideal testosterone levels can be used to maintain a healthy urethra. The regular use of this supplement keeps their erections longer.</span></p>
<p> </p>
<p><span style="color: #339966;"><a style="color: #339966;" href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>Pygeum Bark Extract</strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">Problems with incontinence and night-time urination are common among men with poor prostate health. In this instance, pygeum bark extract can help. It aids in the prevention of urinary tract diseases.</span></p>
<p> </p>
<p><span style="color: #339966;"><a style="color: #339966;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>Catechin</strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">Using this ingredient, you can increase testosterone levels in your body. This aids in the health of the sex organ and the prostate. Additionally, the sexual reproductive system functions better for users.</span></p>
<p> </p>
<p><span style="color: #339966;"><a style="color: #339966;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>Vitamin C</strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">The immune system is supported by adequate levels of vitamin C. Vitamin C helps remove toxins and waste from the body and helps prevent the spread of infections. The reproductive organs can be damaged by free radicals, and this ingredient helps in the eradication of such harmful products.</span></p>
<p> </p>
<p><span style="color: #339966;"><a style="color: #339966;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>L-Arginine</strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">Fish, poultry, and dairy products are some of the foods that contain L-arginine. L-arginine aids the body's ability to produce protein, which in turn improves blood flow. Nitric oxide is formed when L-arginine is consumed, and when blood vessels dilate, the penis remains erect for a long time.</span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>Benefits of Using Animale Male Enhancement South Africa </strong></a></span></h3>
<h3> </h3>
<ul>
<li><span style="font-weight: 400;"> This supplement can increase blood flow to the penis by producing nitric oxide, which has a vasodilating effect. A large amount of nitric oxide is produced by the ingredients in this supplement.</span></li>
</ul>
<p> </p>
<ul>
<li><span style="font-weight: 400;"> The Supplement can help reduce stress. In the bedroom, you'll have better results. </span></li>
</ul>
<p> </p>
<ul>
<li><span style="font-weight: 400;">The supplement improves the mood of those taking it, and it is also credited with improving memory and sleep.</span></li>
</ul>
<p> </p>
<ul>
<li><span style="font-weight: 400;"> A lack of sleep can cause problems. Users can expect a good night's sleep and a productive day from this supplement.</span></li>
</ul>
<p> </p>
<ul>
<li><span style="font-weight: 400;"> Mood and sleep affect stamina. No one using this supplement is left without energy, and their sex life lives up to its name.</span></li>
</ul>
<p> </p>
<ul>
<li><span style="font-weight: 400;"> There is a benefit to boosting testosterone production with this supplement. Testosterone acts as a male hormone to block the conversion of DHT to estradiol, which is responsible for enhancing libido and sexual performance.</span></li>
</ul>
<p> </p>
<ul>
<li><span style="font-weight: 400;"> Increasing testosterone levels can help you lose body fat. Users begin to see an increase in their self-esteem.</span></li>
</ul>
<p> </p>
<ul>
<li><span style="font-weight: 400;"> They can keep up with their diet by taking these gummies.</span></li>
</ul>
<p> </p>
<p><span style="font-weight: 400;">The </span><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>male enhancement tablet</strong></a><span style="font-weight: 400;"> has no known side effects. This supplement is free of harmful chemicals and is produced in a non-GMO environment. No side effects on the body have been reported. Note that it should only be taken by people over the age of 18.</span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><img src="https://i.ibb.co/R3k68qz/Animale-Male-Enhancement-Sale.jpg" alt="Animale-Male-Enhancement-Sale" border="0" /></a><br /><br /></p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>How much Animale Male Enhancement South Africa should I take?</strong></a></span></h3>
<h3> </h3>
<p><span style="font-weight: 400;">Two gummies should be taken daily for at least three to six months. The components of the supplement are non-habit-forming, so it is safe to consume for six months.</span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>Results and longevity of Animale Male Enhancement South Africa</strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Noticeable improvement is seen after at least three to six months of continued use of the product. For three to six months, take this supplement alongside a suitable diet and exercise regimen. The results may last a while.</span></p>
<p> </p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>Why Is Animale Male Enhancement South Africa the Right Choice for You?</strong></a></span></h3>
<p> </p>
<p><strong>This supplement will help you overcome erectile dysfunction. It's a safe and effective solution to an embarrassing problem.</strong></p>
<p> </p>
<p><span style="font-weight: 400;">The available research shows that its potent combination is safe and effective, with no side effects. It is a natural supplement that has been shown to improve stamina, pleasure for both partners, and erections that last longer. Additional health benefits include reducing the risk of heart disease and kidney and prostate problems.</span></p>
<p> </p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>It can help you get erections more often, longer,</strong></a><span style="font-weight: 400;"> stronger, and more intense. The most important thing to remember is that it will take at least six months before the effects are noticeable. The good news is that once you start noticing an improvement, it will continue to get better as your body adapts to the supplement. The best way to use this product is to take two tablets daily, which allows your body to absorb the maximum amount of ingredients. For the fastest results, take one tablet in the morning and one in the evening.</span></p>
<p> </p>
<p><span style="font-weight: 400;">I have been using it for a month now, and my weight is dropping and I am feeling stronger than ever! The only thing I don't like about this product is that you can't buy it in stores. You can only order it online. However, the shipping is really fast and it's super affordable! It's worth every cent! I use this supplement every day, and I am so happy with the results. </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>It has changed my life</strong></a><span style="font-weight: 400;">. I can't stop talking about how good it makes me feel!</span></p>
<p> </p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Conclusion</strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>What makes this product different from other products on the market?</strong></a><span style="font-weight: 400;"> The product is made from 100% natural ingredients, which means it is safe and healthy. With the use of natural herbs, the body can achieve complete and lasting results. The product is also free from side effects. It will not only give you an improved libido but also keep your reproductive system healthy. How does it work? The product contains a blend of several natural herbs and nutrients that help improve the overall fitness of the body.</span></p>
<p> </p>
<p><span style="background-color: #ffff00;"><strong>Our Official Blogs ⇒</strong></span></p>
<p><a href="https://animale-male-enhancement-south-fdb8e3.webflow.io/"><strong>https://animale-male-enhancement-south-fdb8e3.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-za-36bb71.webflow.io/"><strong>https://animale-male-enhancement-za-36bb71.webflow.io/</strong></a></p>
<p><a href="https://animale-cbd-gummies-south-afri-398d54.webflow.io/"><strong>https://animale-cbd-gummies-south-afri-398d54.webflow.io/</strong></a></p>
<p><a href="https://animale-cbd-gummies-za-ddca93.webflow.io/"><strong>https://animale-cbd-gummies-za-ddca93.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-south-africa-64.jimdosite.com/"><strong>https://animale-male-enhancement-south-africa-64.jimdosite.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-za-7.jimdosite.com/"><strong>https://animale-male-enhancement-za-7.jimdosite.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-south-africa-11.jimdosite.com/"><strong>https://animale-cbd-gummies-south-africa-11.jimdosite.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-za.jimdosite.com/"><strong>https://animale-cbd-gummies-za.jimdosite.com/</strong></a></p>
<p><a href="https://animalemaleenhancementsouthafrica.mystrikingly.com/"><strong>https://animalemaleenhancementsouthafrica.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-za.mystrikingly.com/"><strong>https://animale-male-enhancement-za.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-south-africa.mystrikingly.com/"><strong>https://animale-cbd-gummies-south-africa.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-za.mystrikingly.com/"><strong>https://animale-cbd-gummies-za.mystrikingly.com/</strong></a></p>
<p><a href="https://animalemaleenhancementsouth619.godaddysites.com/"><strong>https://animalemaleenhancementsouth619.godaddysites.com/</strong></a></p>
<p><a href="https://animalemaleenhancementza5.godaddysites.com/"><strong>https://animalemaleenhancementza5.godaddysites.com/</strong></a></p>
<p><a href="https://animalecbdgummiessouthafrica.godaddysites.com/"><strong>https://animalecbdgummiessouthafrica.godaddysites.com/</strong></a></p>
<p><a href="https://animalecbdgummiesza.godaddysites.com/"><strong>https://animalecbdgummiesza.godaddysites.com/</strong></a></p>
<p><a href="https://animale-maleenhancement-south-africa.company.site/"><strong>https://animale-maleenhancement-south-africa.company.site/</strong></a></p>
<p><a href="https://animale-cbd-gummies-za.jigsy.com/"><strong>https://animale-cbd-gummies-za.jigsy.com/</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-gummies-south.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-gummies-south.html</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-south-africa.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-south-africa.html</strong></a></p>
<p><a href="https://sites.google.com/view/animalemegummiessouthafrica/"><strong>https://sites.google.com/view/animalemegummiessouthafrica/</strong></a></p>
<p><a href="https://sites.google.com/view/animale-male-enhancement-in-za/"><strong>https://sites.google.com/view/animale-male-enhancement-in-za/</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-south-africa-za/c/WRxOLQ-sQDo"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-south-africa-za/c/WRxOLQ-sQDo</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-za-south-africa/c/d-IKmqeADNU"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-za-south-africa/c/d-IKmqeADNU</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/PIXMAQOLYBM"><strong>https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/PIXMAQOLYBM</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/VgcEIqv9afc"><strong>https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/VgcEIqv9afc</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/TrOXo30F_I4"><strong>https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/TrOXo30F_I4</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/d12e9358-9753-4fa4-9427-3551a662342f/page/KjSRD"><strong>https://lookerstudio.google.com/reporting/d12e9358-9753-4fa4-9427-3551a662342f/page/KjSRD</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/554936a2-a2ba-461b-8d55-bcf9804e0372/page/4i6aD"><strong>https://lookerstudio.google.com/reporting/554936a2-a2ba-461b-8d55-bcf9804e0372/page/4i6aD</strong></a></p>
<h3><span style="background-color: #ffff00;"><strong>Our Official Facebook Links ⇒</strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>https://www.facebook.com/AnimaleCBDGummiesZA/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/</strong></a></p>
<p><a href="https://www.facebook.com/events/1121615602562904/"><strong>https://www.facebook.com/events/1121615602562904/</strong></a></p>
<p><a href="https://www.facebook.com/events/1295846104688434/"><strong>https://www.facebook.com/events/1295846104688434/</strong></a></p>
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p> </p>
<h3><span style="font-weight: 400;">➥ Product Name — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Animale Male Enhancement</strong></a></span><span style="font-weight: 400;"> </span></h3>
<h3><span style="font-weight: 400;">➥ Country — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>South Africa</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Main Benefits — </span><span style="color: #800000;"><a style="color: #800000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>Male Enhancement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Rating —</span> <span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>5.0/5.0</strong></a></span><span style="font-weight: 400;"> ⭐⭐⭐⭐⭐</span></h3>
<h3><span style="font-weight: 400;">➥ Results — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>In 1-3 Months</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Availability — </span><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>Online</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Side Effects — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>No Major Side Effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Official Website (Sale Is Live) — </span><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>Click Here To Order Animale Male Enhancement South Africa</strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">There are many ways to look after your sexual health. </span><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>Sexual health</strong></a><span style="font-weight: 400;"> is best during your 20s, and your sexual peak falls between the ages of 20 and 50. As we get older, our performance declines, even though the desire to perform at the highest level stays the same. Sexual desire decreases with age, and sexual health deteriorates further, which makes us less confident in bed.</span></p>
<p> </p>
<h2><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-south-africa"><strong>>>>>>>>>=====CLICK HERE TO BUY & BOOST YOUR SEX POWER=====<<<<<<<<<<</strong></a></span></h2>
<p> </p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-gummies-south.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-gummies-south.html</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-south-africa.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/05/animale-male-enhancement-south-africa.html</strong></a></p>
<p><a href="https://sites.google.com/view/animalemegummiessouthafrica/"><strong>https://sites.google.com/view/animalemegummiessouthafrica/</strong></a></p>
<p><a href="https://sites.google.com/view/animale-male-enhancement-in-za/"><strong>https://sites.google.com/view/animale-male-enhancement-in-za/</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-south-africa-za/c/WRxOLQ-sQDo"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-south-africa-za/c/WRxOLQ-sQDo</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-za-south-africa/c/d-IKmqeADNU"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-za-south-africa/c/d-IKmqeADNU</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/PIXMAQOLYBM"><strong>https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/PIXMAQOLYBM</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/VgcEIqv9afc"><strong>https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/VgcEIqv9afc</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/TrOXo30F_I4"><strong>https://groups.google.com/g/animale-male-enhancement-gummies--capsules-south-africa/c/TrOXo30F_I4</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/d12e9358-9753-4fa4-9427-3551a662342f/page/KjSRD"><strong>https://lookerstudio.google.com/reporting/d12e9358-9753-4fa4-9427-3551a662342f/page/KjSRD</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/554936a2-a2ba-461b-8d55-bcf9804e0372/page/4i6aD"><strong>https://lookerstudio.google.com/reporting/554936a2-a2ba-461b-8d55-bcf9804e0372/page/4i6aD</strong></a></p>
<p> </p>
<p><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Australia Official Links ⇒</strong></span></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>https://www.facebook.com/AnimaleMaleEnhancementPills/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/</strong></a></p>
<p> </p>
<p><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Uruguay Official Links ⇒</strong></span></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/</strong></a></p>
<p> </p>
<p><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Venezuela Official Links ⇒</strong></span></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVe/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/</strong></a></p>
<p> </p>
<p><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Malaysia Official Links ⇒</strong></span></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMY/</strong></a></p>
<p> </p>
<p><span style="background-color: #ffff00;"><strong>Active Keto Gummies Australia & Ireland Official Links ⇒</strong></span></p>
<p><a href="https://www.facebook.com/ActiveKetoGummies.AU.NZ.CA/"><strong>https://www.facebook.com/ActiveKetoGummies.AU.NZ.CA/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesInAustralia/"><strong>https://www.facebook.com/ActiveKetoGummiesInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesAUAustralia/"><strong>https://www.facebook.com/ActiveKetoGummiesAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesAUInAustralia/"><strong>https://www.facebook.com/ActiveKetoGummiesAUInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesInIE/"><strong>https://www.facebook.com/ActiveKetoGummiesInIE/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesOfIE/"><strong>https://www.facebook.com/ActiveKetoGummiesOfIE/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesOfIreland/"><strong>https://www.facebook.com/ActiveKetoGummiesOfIreland/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesAtIreland/"><strong>https://www.facebook.com/ActiveKetoGummiesAtIreland/</strong></a></p>
<p><strong><a href="https://www.facebook.com/ActiveKetoGummiesIrelandOfficial/">https://www.facebook.com/ActiveKetoGummiesIrelandOfficial/</a></strong></p>
<p> </p>
<p><span style="background-color: #ffff00;"><strong>Viarecta Deutschland Official Links ⇒</strong></span></p>
<p><a href="https://www.facebook.com/ViarectaDE/"><strong>https://www.facebook.com/ViarectaDE/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaEbay/"><strong>https://www.facebook.com/ViarectaEbay/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiDM/"><strong>https://www.facebook.com/viarectaBeiDM/</strong></a></p>
<p><a href="https://www.facebook.com/viarectakaufen/"><strong>https://www.facebook.com/viarectakaufen/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaInGermany/"><strong>https://www.facebook.com/ViarectaInGermany/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiAmazon/"><strong>https://www.facebook.com/viarectaBeiAmazon/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaDeutschland/"><strong>https://www.facebook.com/ViarectaDeutschland/</strong></a></p>
<p><strong><a href="https://www.facebook.com/ViagraKaufen/">https://www.facebook.com/ViagraKaufen/</a></strong></p>
<p> </p>
<h3><span style="background-color: #ffff00;"><strong>Related Searches : </strong></span></h3>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>#AnimaleMaleEnhancementSouthAfrica</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>#AnimaleMaleEnhancementZA</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>#AnimaleMaleEnhancementDischem</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>#AnimaleMaleEnhancementPriceAtClicks</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>#AnimaleMaleEnhancementBuy</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>#AnimaleMaleEnhancementOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>#AnimaleMaleEnhancementShopNow</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>#AnimaleMaleEnhancementOffer</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica"><strong>#AnimaleMaleEnhancementDiscount</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>#AnimaleMaleEnhancementOrder</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>#AnimaleMaleEnhancementBenefits</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>#AnimaleMaleEnhancementScam</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>#AnimaleMaleEnhancementSexBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>#AnimaleMaleEnhancementPenisEnlargement</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>#AnimaleMaleEnhancementStaminaBooster</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>#AnimaleMaleEnhancementIngredients</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>#AnimaleMaleEnhancementPurchase</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>#AnimaleMaleEnhancementGummiesSouthAfrica</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>#AnimaleMaleEnhancementSouthAfricaReview</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>#AnimaleMaleEnhancementSouthAfricaReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>#AnimaleMaleEnhancementGummiesZA</strong></a></p> |
yzhuang/autotree_automl_heloc_sgosdt_l256_d3_sd0 | 2023-08-31T07:10:02.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 328560000
num_examples: 10000
- name: validation
num_bytes: 328560000
num_examples: 10000
download_size: 133253810
dataset_size: 657120000
---
# Dataset Card for "autotree_automl_heloc_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |