---
pretty_name: Evaluation run of llm-agents/tora-code-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-code-7b-v1.0](https://huggingface.co/llm-agents/tora-code-7b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T11:50:58.128612](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0/blob/main/results_2023-10-28T11-50-58.128612.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652192884,\n \"f1\": 0.04895343959731551,\n\
\ \"f1_stderr\": 0.0011757746481772687,\n \"acc\": 0.33245361192407963,\n\
\ \"acc_stderr\": 0.009816859128324334\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192884,\n\
\ \"f1\": 0.04895343959731551,\n \"f1_stderr\": 0.0011757746481772687\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04927975739196361,\n \
\ \"acc_stderr\": 0.005962150655812473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6156274664561957,\n \"acc_stderr\": 0.013671567600836194\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T11_50_58.128612
path:
- '**/details_harness|drop|3_2023-10-28T11-50-58.128612.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T11-50-58.128612.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T11_50_58.128612
path:
- '**/details_harness|gsm8k|5_2023-10-28T11-50-58.128612.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T11-50-58.128612.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T11_50_58.128612
path:
- '**/details_harness|winogrande|5_2023-10-28T11-50-58.128612.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T11-50-58.128612.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- results_2023-10-10T14-12-45.914011.parquet
- split: 2023_10_28T11_50_58.128612
path:
- results_2023-10-28T11-50-58.128612.parquet
- split: latest
path:
- results_2023-10-28T11-50-58.128612.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-code-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-code-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-code-7b-v1.0](https://huggingface.co/llm-agents/tora-code-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0",
"harness_winogrande_5",
split="train")
```
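Note that the per-run split names listed in the configurations above are simply the run timestamp with `-` and `:` replaced by `_`. A small helper — hypothetical, not part of the `datasets` API — makes the mapping explicit:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2023-10-28T11:50:58.128612' to the
    corresponding split name, e.g. '2023_10_28T11_50_58.128612'."""
    return ts.replace("-", "_").replace(":", "_")

# The 'latest' split of each configuration points at the same file as the
# most recent timestamped split.
print(run_timestamp_to_split("2023-10-28T11:50:58.128612"))
# prints 2023_10_28T11_50_58.128612
```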
## Latest results
These are the [latest results from run 2023-10-28T11:50:58.128612](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0/blob/main/results_2023-10-28T11-50-58.128612.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192884,
"f1": 0.04895343959731551,
"f1_stderr": 0.0011757746481772687,
"acc": 0.33245361192407963,
"acc_stderr": 0.009816859128324334
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192884,
"f1": 0.04895343959731551,
"f1_stderr": 0.0011757746481772687
},
"harness|gsm8k|5": {
"acc": 0.04927975739196361,
"acc_stderr": 0.005962150655812473
},
"harness|winogrande|5": {
"acc": 0.6156274664561957,
"acc_stderr": 0.013671567600836194
}
}
```
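The `"all"` block appears to be a plain per-metric mean over the tasks that report that metric — for instance, the aggregate `acc` is the average of the gsm8k and winogrande accuracies. A minimal sketch using the numbers above:

```python
# Per-task accuracies copied from the results snippet above.
results = {
    "harness|gsm8k|5": {"acc": 0.04927975739196361},
    "harness|winogrande|5": {"acc": 0.6156274664561957},
}

# Average the metric over the tasks that report it.
accs = [m["acc"] for m in results.values()]
agg_acc = sum(accs) / len(accs)
print(f"{agg_acc:.6f}")  # matches the "acc" value in the "all" block (0.332454)
```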
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,670 | [
open-llm-leaderboard/details_llm-agents__tora-7b-v1.0 | 2023-10-27T12:52:43.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T14:34:35 |
---
pretty_name: Evaluation run of llm-agents/tora-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-7b-v1.0](https://huggingface.co/llm-agents/tora-7b-v1.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-7b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T12:52:31.057587](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-7b-v1.0/blob/main/results_2023-10-27T12-52-31.057587.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03166946308724832,\n\
\ \"em_stderr\": 0.001793377907859907,\n \"f1\": 0.0924370805369127,\n\
\ \"f1_stderr\": 0.002203336567209257,\n \"acc\": 0.3803074247848667,\n\
\ \"acc_stderr\": 0.008348384971774042\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03166946308724832,\n \"em_stderr\": 0.001793377907859907,\n\
\ \"f1\": 0.0924370805369127,\n \"f1_stderr\": 0.002203336567209257\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \
\ \"acc_stderr\": 0.0043020450465642845\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T12_52_31.057587
path:
- '**/details_harness|drop|3_2023-10-27T12-52-31.057587.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T12-52-31.057587.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T12_52_31.057587
path:
- '**/details_harness|gsm8k|5_2023-10-27T12-52-31.057587.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T12-52-31.057587.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T12_52_31.057587
path:
- '**/details_harness|winogrande|5_2023-10-27T12-52-31.057587.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T12-52-31.057587.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- results_2023-10-10T14-34-11.685092.parquet
- split: 2023_10_27T12_52_31.057587
path:
- results_2023-10-27T12-52-31.057587.parquet
- split: latest
path:
- results_2023-10-27T12-52-31.057587.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-7b-v1.0](https://huggingface.co/llm-agents/tora-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-7b-v1.0",
"harness_winogrande_5",
split="train")
```
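As the config listings above show, each run's split name appears to be just the run timestamp with `-` and `:` replaced by underscores. A minimal sketch of that convention, using the latest run's timestamp from this card:

```python
# Sketch of the observed split-naming convention: the run timestamp
# "2023-10-27T12:52:31.057587" maps to the split "2023_10_27T12_52_31.057587".
run_timestamp = "2023-10-27T12:52:31.057587"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_10_27T12_52_31.057587
```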
## Latest results
These are the [latest results from run 2023-10-27T12:52:31.057587](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-7b-v1.0/blob/main/results_2023-10-27T12-52-31.057587.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03166946308724832,
"em_stderr": 0.001793377907859907,
"f1": 0.0924370805369127,
"f1_stderr": 0.002203336567209257,
"acc": 0.3803074247848667,
"acc_stderr": 0.008348384971774042
},
"harness|drop|3": {
"em": 0.03166946308724832,
"em_stderr": 0.001793377907859907,
"f1": 0.0924370805369127,
"f1_stderr": 0.002203336567209257
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.0043020450465642845
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
}
}
```
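The `all` block aggregates the per-task metrics. Assuming an unweighted mean over the tasks that report `acc` (gsm8k and winogrande here), the aggregate value above can be reproduced:

```python
# Per-task accuracies taken from the results above.
per_task_acc = {
    "harness|gsm8k|5": 0.025018953752843062,
    "harness|winogrande|5": 0.7355958958168903,
}
# Unweighted mean across tasks reproduces "all" -> "acc".
overall_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(overall_acc)  # ≈ 0.3803074247848667
```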
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
mponty/code_champs_meta | 2023-10-10T15:08:17.000Z | ["region:us"] | mponty | null | null | 1 | 0 | 2023-10-10T14:53:16 |
---
dataset_info:
features:
- name: problem_id
dtype: string
- name: contest
dtype: string
- name: problem
dtype: string
- name: lang
dtype: string
- name: problem_title
dtype: string
- name: problem_statement
dtype: string
- name: page
dtype: string
- name: long_tags
dtype: string
- name: short_tags
dtype: string
- name: tutorial_link
dtype: string
- name: tutorial_page
dtype: string
splits:
- name: train
num_bytes: 3683371047
num_examples: 16504
download_size: 457183665
dataset_size: 3683371047
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_champs_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus | 2023-10-23T23:05:17.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T14:54:14 |
---
pretty_name: Evaluation run of lgaalves/tinyllama-1.1b-chat-v0.3_platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/tinyllama-1.1b-chat-v0.3_platypus](https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T23:05:04.270048](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus/blob/main/results_2023-10-23T23-05-04.270048.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0025167785234899327,\n\
\ \"em_stderr\": 0.0005131152834514911,\n \"f1\": 0.049414848993288615,\n\
\ \"f1_stderr\": 0.0012773102707031435,\n \"acc\": 0.2816590502599073,\n\
\ \"acc_stderr\": 0.00797944490002852\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514911,\n\
\ \"f1\": 0.049414848993288615,\n \"f1_stderr\": 0.0012773102707031435\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948044\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5580110497237569,\n \"acc_stderr\": 0.013957584079108994\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T23_05_04.270048
path:
- '**/details_harness|drop|3_2023-10-23T23-05-04.270048.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T23-05-04.270048.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T23_05_04.270048
path:
- '**/details_harness|gsm8k|5_2023-10-23T23-05-04.270048.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T23-05-04.270048.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T23_05_04.270048
path:
- '**/details_harness|winogrande|5_2023-10-23T23-05-04.270048.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T23-05-04.270048.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- results_2023-10-10T14-53-56.428911.parquet
- split: 2023_10_23T23_05_04.270048
path:
- results_2023-10-23T23-05-04.270048.parquet
- split: latest
path:
- results_2023-10-23T23-05-04.270048.parquet
---
# Dataset Card for Evaluation run of lgaalves/tinyllama-1.1b-chat-v0.3_platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/tinyllama-1.1b-chat-v0.3_platypus](https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus",
"harness_winogrande_5",
split="train")
```
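The split names in the configurations above follow a simple convention derived from the run timestamp: dashes and colons become underscores, while the `T` separator and the fractional seconds are kept. A minimal sketch of that mapping (the helper name is ours, not part of the `datasets` API):

```python
# Convert a run timestamp into the split name used by this dataset.
# Convention inferred from the config listing above: "-" and ":" in the
# date/time become "_"; the "T" and fractional seconds stay unchanged.
def run_timestamp_to_split(ts: str) -> str:
    date_part, time_part = ts.split("T")
    time_main, _, fraction = time_part.partition(".")
    split = date_part.replace("-", "_") + "T" + time_main.replace(":", "_")
    return split + ("." + fraction if fraction else "")

print(run_timestamp_to_split("2023-10-23T23:05:04.270048"))
# 2023_10_23T23_05_04.270048
```

For example, the run recorded at `2023-10-23T23:05:04.270048` is exposed as the split `2023_10_23T23_05_04.270048`, alongside the `latest` alias.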
## Latest results
These are the [latest results from run 2023-10-23T23:05:04.270048](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus/blob/main/results_2023-10-23T23-05-04.270048.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514911,
"f1": 0.049414848993288615,
"f1_stderr": 0.0012773102707031435,
"acc": 0.2816590502599073,
"acc_stderr": 0.00797944490002852
},
"harness|drop|3": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514911,
"f1": 0.049414848993288615,
"f1_stderr": 0.0012773102707031435
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948044
},
"harness|winogrande|5": {
"acc": 0.5580110497237569,
"acc_stderr": 0.013957584079108994
}
}
```
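As a quick sanity check (our own arithmetic, not produced by the evaluation harness), the `"all"` accuracy above is the unweighted mean of the per-task accuracies:

```python
# Reproduce the aggregated "all" accuracy from the per-task numbers above.
task_acc = {
    "harness|gsm8k|5": 0.00530705079605762,
    "harness|winogrande|5": 0.5580110497237569,
}
mean_acc = sum(task_acc.values()) / len(task_acc)
# The unweighted mean matches the reported "all" acc of 0.2816590502599073.
assert abs(mean_acc - 0.2816590502599073) < 1e-12
```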
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,834 | [
[
-0.0250244140625,
-0.05352783203125,
0.0194091796875,
0.0182037353515625,
-0.011810302734375,
0.00827789306640625,
-0.03594970703125,
-0.0146484375,
0.03570556640625,
0.034393310546875,
-0.05377197265625,
-0.0643310546875,
-0.04608154296875,
0.00746154785156... |
simayy/ml4se-test-dataset | 2023-10-10T14:56:21.000Z | [
"region:us"
] | simayy | null | null | 0 | 0 | 2023-10-10T14:56:21 | Entry not found | 15 | [
[
-0.0213775634765625,
-0.01497650146484375,
0.05718994140625,
0.0288543701171875,
-0.0350341796875,
0.046478271484375,
0.052490234375,
0.005062103271484375,
0.051361083984375,
0.01702880859375,
-0.05206298828125,
-0.01494598388671875,
-0.06036376953125,
0.037... |
open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0 | 2023-10-23T13:30:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T14:56:43 | ---
pretty_name: Evaluation run of llm-agents/tora-code-13b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-code-13b-v1.0](https://huggingface.co/llm-agents/tora-code-13b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T13:29:53.824155](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0/blob/main/results_2023-10-23T13-29-53.824155.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626670027,\n \"f1\": 0.0450398489932886,\n\
\ \"f1_stderr\": 0.0010718150921397497,\n \"acc\": 0.35388406825624874,\n\
\ \"acc_stderr\": 0.010576065743023385\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626670027,\n\
\ \"f1\": 0.0450398489932886,\n \"f1_stderr\": 0.0010718150921397497\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08188021228203184,\n \
\ \"acc_stderr\": 0.007552338527716949\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6258879242304657,\n \"acc_stderr\": 0.013599792958329823\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-13b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T13_29_53.824155
path:
- '**/details_harness|drop|3_2023-10-23T13-29-53.824155.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T13-29-53.824155.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T13_29_53.824155
path:
- '**/details_harness|gsm8k|5_2023-10-23T13-29-53.824155.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T13-29-53.824155.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T13_29_53.824155
path:
- '**/details_harness|winogrande|5_2023-10-23T13-29-53.824155.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T13-29-53.824155.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- results_2023-10-10T14-56-19.008780.parquet
- split: 2023_10_23T13_29_53.824155
path:
- results_2023-10-23T13-29-53.824155.parquet
- split: latest
path:
- results_2023-10-23T13-29-53.824155.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-code-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-code-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-code-7b-v1.0](https://huggingface.co/llm-agents/tora-code-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0",
"harness_winogrande_5",
split="train")
```
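The split names and the parquet file names in the metadata above encode the same run timestamp with different separator conventions (e.g. split `2023_10_23T13_29_53.824155` vs. file suffix `2023-10-23T13-29-53.824155`). A minimal sketch of the mapping, using a hypothetical helper name inferred from those patterns:

```python
def run_timestamp_to_names(ts: str):
    """Map an ISO-like run timestamp to the split name and parquet file suffix.

    Hypothetical helper: split names replace "-" and ":" with "_",
    while file names keep "-" in the date and replace ":" with "-".
    """
    date, time = ts.split("T")
    split_name = f"{date.replace('-', '_')}T{time.replace(':', '_')}"
    file_suffix = f"{date}T{time.replace(':', '-')}"
    return split_name, file_suffix

split_name, file_suffix = run_timestamp_to_names("2023-10-23T13:29:53.824155")
print(split_name)   # 2023_10_23T13_29_53.824155
print(file_suffix)  # 2023-10-23T13-29-53.824155
```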
## Latest results
These are the [latest results from run 2023-10-23T13:29:53.824155](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0/blob/main/results_2023-10-23T13-29-53.824155.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670027,
"f1": 0.0450398489932886,
"f1_stderr": 0.0010718150921397497,
"acc": 0.35388406825624874,
"acc_stderr": 0.010576065743023385
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670027,
"f1": 0.0450398489932886,
"f1_stderr": 0.0010718150921397497
},
"harness|gsm8k|5": {
"acc": 0.08188021228203184,
"acc_stderr": 0.007552338527716949
},
"harness|winogrande|5": {
"acc": 0.6258879242304657,
"acc_stderr": 0.013599792958329823
}
}
```
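Once downloaded, the results JSON can be inspected directly with the standard library; a minimal sketch (the structure below is copied from the "Latest results" block above, abbreviated to the accuracy fields):

```python
import json

# Results structure copied from the "Latest results" block above,
# keeping only the accuracy fields for brevity.
results_json = """
{
  "all": {"acc": 0.35388406825624874, "acc_stderr": 0.010576065743023385},
  "harness|gsm8k|5": {"acc": 0.08188021228203184},
  "harness|winogrande|5": {"acc": 0.6258879242304657}
}
"""
results = json.loads(results_json)

# Per-task accuracies, skipping the aggregated "all" entry.
per_task = {task: metrics["acc"]
            for task, metrics in results.items() if task != "all"}
print(per_task)
```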
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B-v1](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:57:20.867230](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1/blob/main/results_2023-10-10T14-57-20.867230.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5221924256666464,\n\
\ \"acc_stderr\": 0.03497779761198706,\n \"acc_norm\": 0.5257525929962562,\n\
\ \"acc_norm_stderr\": 0.03496709701060229,\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.5936287801538656,\n\
\ \"mc2_stderr\": 0.015090925037000012\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5290102389078498,\n \"acc_norm_stderr\": 0.01458677635529431\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5657239593706433,\n\
\ \"acc_stderr\": 0.004946485466544624,\n \"acc_norm\": 0.7467635929097789,\n\
\ \"acc_norm_stderr\": 0.0043397644342190655\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5870967741935483,\n\
\ \"acc_stderr\": 0.028009138125400387,\n \"acc_norm\": 0.5870967741935483,\n\
\ \"acc_norm_stderr\": 0.028009138125400387\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986476,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.710091743119266,\n \"acc_stderr\": 0.0194530666092016,\n \"acc_norm\"\
: 0.710091743119266,\n \"acc_norm_stderr\": 0.0194530666092016\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105307,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\
\ \"acc_stderr\": 0.016203792703197793,\n \"acc_norm\": 0.7113665389527458,\n\
\ \"acc_norm_stderr\": 0.016203792703197793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952236,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952236\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372432,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372432\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.028408302020332694,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.028408302020332694\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854926,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854926\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.030359697079046104,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.030359697079046104\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355044,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.5936287801538656,\n\
\ \"mc2_stderr\": 0.015090925037000012\n }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-57-20.867230.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- results_2023-10-10T14-57-20.867230.parquet
- split: latest
path:
- results_2023-10-10T14-57-20.867230.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B-v1](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1",
"harness_truthfulqa_mc_0",
split="train")
```
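The config names in this repo appear to follow a simple derivation from the harness task identifiers (e.g. `harness|truthfulqa:mc|0` maps to the `harness_truthfulqa_mc_0` config used above). A hypothetical sketch of that mapping — the helper `config_name` is illustrative, not part of the leaderboard tooling:

```python
# Hypothetical sketch (not official leaderboard code): derive a dataset
# config name from a harness task identifier by replacing the "|", "-",
# and ":" separators with underscores.
def config_name(task_id: str) -> str:
    return task_id.replace("|", "_").replace("-", "_").replace(":", "_")

print(config_name("harness|truthfulqa:mc|0"))  # -> harness_truthfulqa_mc_0
print(config_name("harness|hendrycksTest-world_religions|5"))  # -> harness_hendrycksTest_world_religions_5
```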
## Latest results
These are the [latest results from run 2023-10-10T14:57:20.867230](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1/blob/main/results_2023-10-10T14-57-20.867230.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5221924256666464,
"acc_stderr": 0.03497779761198706,
"acc_norm": 0.5257525929962562,
"acc_norm_stderr": 0.03496709701060229,
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.5936287801538656,
"mc2_stderr": 0.015090925037000012
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5290102389078498,
"acc_norm_stderr": 0.01458677635529431
},
"harness|hellaswag|10": {
"acc": 0.5657239593706433,
"acc_stderr": 0.004946485466544624,
"acc_norm": 0.7467635929097789,
"acc_norm_stderr": 0.0043397644342190655
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.028009138125400387,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.028009138125400387
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986476,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.0194530666092016,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.0194530666092016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105307,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7113665389527458,
"acc_stderr": 0.016203792703197793,
"acc_norm": 0.7113665389527458,
"acc_norm_stderr": 0.016203792703197793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952236,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952236
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372432,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.028408302020332694,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.028408302020332694
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854926,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854926
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.030359697079046104,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.030359697079046104
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.0202239460050743,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.0202239460050743
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355044,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.5936287801538656,
"mc2_stderr": 0.015090925037000012
}
}
```
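For local post-processing, a results dictionary like the one above can be reduced in a few lines. As a minimal sketch (assuming the `harness|hendrycksTest-*` key naming shown above; `mmlu_average` is a hypothetical helper, not official leaderboard code), averaging `acc` over the MMLU subtasks:

```python
# Hypothetical helper: average the "acc" metric over every
# hendrycksTest subtask present in a results dict.
def mmlu_average(results: dict) -> float:
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Tiny excerpt of the results above, for illustration:
sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.43373493975903615},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7017543859649122},
    "harness|truthfulqa:mc|0": {"mc2": 0.5936287801538656},  # skipped: not an MMLU task
}
print(round(mmlu_average(sample), 4))  # -> 0.5677
```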
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 65,091 | [
open-llm-leaderboard/details_llm-agents__tora-13b-v1.0 | 2023-10-29T07:05:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T15:17:27 | ---
pretty_name: Evaluation run of llm-agents/tora-13b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-13b-v1.0](https://huggingface.co/llm-agents/tora-13b-v1.0) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-13b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T07:05:06.186132](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0/blob/main/results_2023-10-29T07-05-06.186132.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n\
\ \"em_stderr\": 0.0005023380498893349,\n \"f1\": 0.06216652684563757,\n\
\ \"f1_stderr\": 0.0014129871021706449,\n \"acc\": 0.4273381630746787,\n\
\ \"acc_stderr\": 0.010139621814927263\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893349,\n\
\ \"f1\": 0.06216652684563757,\n \"f1_stderr\": 0.0014129871021706449\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09855951478392722,\n \
\ \"acc_stderr\": 0.008210320350946335\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908189\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-13b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T06_57_18.434824
path:
- '**/details_harness|drop|3_2023-10-29T06-57-18.434824.parquet'
- split: 2023_10_29T07_05_06.186132
path:
- '**/details_harness|drop|3_2023-10-29T07-05-06.186132.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T07-05-06.186132.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T06_57_18.434824
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-57-18.434824.parquet'
- split: 2023_10_29T07_05_06.186132
path:
- '**/details_harness|gsm8k|5_2023-10-29T07-05-06.186132.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T07-05-06.186132.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T06_57_18.434824
path:
- '**/details_harness|winogrande|5_2023-10-29T06-57-18.434824.parquet'
- split: 2023_10_29T07_05_06.186132
path:
- '**/details_harness|winogrande|5_2023-10-29T07-05-06.186132.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T07-05-06.186132.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- results_2023-10-10T15-17-02.134278.parquet
- split: 2023_10_29T06_57_18.434824
path:
- results_2023-10-29T06-57-18.434824.parquet
- split: 2023_10_29T07_05_06.186132
path:
- results_2023-10-29T07-05-06.186132.parquet
- split: latest
path:
- results_2023-10-29T07-05-06.186132.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-13b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-13b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-13b-v1.0](https://huggingface.co/llm-agents/tora-13b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-13b-v1.0",
"harness_winogrande_5",
split="train")
```
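
Per-run splits are named after the run timestamp, with `-` and `:` replaced by `_` and the fractional-seconds `.` kept (compare the split `2023_10_29T07_05_06.186132` with its parquet file `details_harness|winogrande|5_2023-10-29T07-05-06.186132.parquet` in the config list above). A small helper sketching that mapping:

```python
def split_name(run_timestamp: str) -> str:
    """Turn a run timestamp into its split name.

    '2023-10-29T07:05:06.186132' -> '2023_10_29T07_05_06.186132'
    """
    # '-' and ':' become '_'; the '.' before the microseconds is kept.
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-10-29T07:05:06.186132"))  # 2023_10_29T07_05_06.186132
```

Passing the resulting name as `split=` selects that specific run instead of `latest`.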
## Latest results
These are the [latest results from run 2023-10-29T07:05:06.186132](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0/blob/main/results_2023-10-29T07-05-06.186132.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893349,
"f1": 0.06216652684563757,
"f1_stderr": 0.0014129871021706449,
"acc": 0.4273381630746787,
"acc_stderr": 0.010139621814927263
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893349,
"f1": 0.06216652684563757,
"f1_stderr": 0.0014129871021706449
},
"harness|gsm8k|5": {
"acc": 0.09855951478392722,
"acc_stderr": 0.008210320350946335
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908189
}
}
```
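
For the accuracy metrics, the `all` block above is the unweighted mean of the task-level values (the `em`/`f1` entries come from the single `drop` task, so they pass through unchanged). A minimal sanity check, reusing the numbers copied from the JSON above:

```python
# Per-task accuracies copied verbatim from the "Latest results" JSON above.
task_acc = {
    "harness|gsm8k|5": 0.09855951478392722,
    "harness|winogrande|5": 0.7561168113654302,
}

# The unweighted mean over tasks reproduces the "all" accuracy.
overall_acc = sum(task_acc.values()) / len(task_acc)
print(f"overall acc = {overall_acc:.10f}")  # agrees with the reported 0.4273381630746787
```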
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
text2font/words_with_path_tags_version_2 | 2023-10-10T15:20:12.000Z | [
"region:us"
] | text2font | null | null | 0 | 0 | 2023-10-10T15:18:11 | Entry not found | 15
text2font/words_with_path_tags_version_2_splitted | 2023-10-10T15:20:24.000Z | [
"region:us"
] | text2font | null | null | 0 | 0 | 2023-10-10T15:19:35 | Entry not found | 15
open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu | 2023-10-25T10:05:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T15:25:46 |
---
pretty_name: Evaluation run of itsliupeng/llama2_7b_mmlu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama2_7b_mmlu](https://huggingface.co/itsliupeng/llama2_7b_mmlu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T10:05:20.920502](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu/blob/main/results_2023-10-25T10-05-20.920502.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893119021,\n \"f1\": 0.05594588926174501,\n\
\ \"f1_stderr\": 0.0013036425627808016,\n \"acc\": 0.41156271672651484,\n\
\ \"acc_stderr\": 0.009842322182656855\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119021,\n\
\ \"f1\": 0.05594588926174501,\n \"f1_stderr\": 0.0013036425627808016\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07884761182714177,\n \
\ \"acc_stderr\": 0.00742339051987324\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n\
\ }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama2_7b_mmlu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T10_05_20.920502
path:
- '**/details_harness|drop|3_2023-10-25T10-05-20.920502.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T10-05-20.920502.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T10_05_20.920502
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-05-20.920502.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-05-20.920502.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T10_05_20.920502
path:
- '**/details_harness|winogrande|5_2023-10-25T10-05-20.920502.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T10-05-20.920502.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- results_2023-10-10T15-25-23.413789.parquet
- split: 2023_10_25T10_05_20.920502
path:
- results_2023-10-25T10-05-20.920502.parquet
- split: latest
path:
- results_2023-10-25T10-05-20.920502.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama2_7b_mmlu
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/llama2_7b_mmlu
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_mmlu](https://huggingface.co/itsliupeng/llama2_7b_mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu",
"harness_winogrande_5",
split="train")
```
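The split names visible in the configs above follow a fixed convention: each run's split is the run timestamp with the dashes and colons of the filename form replaced by underscores (compare, e.g., `results_2023-10-25T10-05-20.920502.parquet` with the split `2023_10_25T10_05_20.920502`). A minimal sketch of that mapping — the helper name is ours, not part of any library:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (as used in result filenames) to its split name."""
    date, time = ts.split("T")
    # Both date separators and time separators become underscores in split names.
    return date.replace("-", "_") + "T" + time.replace("-", "_").replace(":", "_")

# The winogrande run below was stored under this split:
print(run_timestamp_to_split("2023-10-25T10-05-20.920502"))
# 2023_10_25T10_05_20.920502
```

Passing such a timestamped split name to `load_dataset` instead of `"latest"` selects one specific run.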
## Latest results
These are the [latest results from run 2023-10-25T10:05:20.920502](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu/blob/main/results_2023-10-25T10-05-20.920502.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119021,
"f1": 0.05594588926174501,
"f1_stderr": 0.0013036425627808016,
"acc": 0.41156271672651484,
"acc_stderr": 0.009842322182656855
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119021,
"f1": 0.05594588926174501,
"f1_stderr": 0.0013036425627808016
},
"harness|gsm8k|5": {
"acc": 0.07884761182714177,
"acc_stderr": 0.00742339051987324
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
}
}
```
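These aggregates can also be consumed programmatically. As a small sketch (using a subset of the numbers above; the variable names are our own), separating per-task accuracies from the `"all"` aggregate:

```python
import json

# Subset of the results JSON shown above (drop reports em/f1 rather than acc).
results_json = """
{
  "all": {"acc": 0.41156271672651484, "acc_stderr": 0.009842322182656855},
  "harness|gsm8k|5": {"acc": 0.07884761182714177, "acc_stderr": 0.00742339051987324},
  "harness|winogrande|5": {"acc": 0.744277821625888, "acc_stderr": 0.012261253845440473}
}
"""
results = json.loads(results_json)

# Per-task accuracies, excluding the "all" aggregate.
task_acc = {task: metrics["acc"] for task, metrics in results.items()
            if task != "all" and "acc" in metrics}
best_task = max(task_acc, key=task_acc.get)
print(best_task)  # harness|winogrande|5
```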
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of JosephusCheung/LL7M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/LL7M](https://huggingface.co/JosephusCheung/LL7M) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__LL7M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T03:13:24.379539](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__LL7M/blob/main/results_2023-10-24T03-13-24.379539.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.019819630872483222,\n\
\ \"em_stderr\": 0.0014273827117586067,\n \"f1\": 0.07556312919463118,\n\
\ \"f1_stderr\": 0.0018868261588306972,\n \"acc\": 0.3234745894051663,\n\
\ \"acc_stderr\": 0.007810892751790338\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.019819630872483222,\n \"em_stderr\": 0.0014273827117586067,\n\
\ \"f1\": 0.07556312919463118,\n \"f1_stderr\": 0.0018868261588306972\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.00213867030146044\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120236\n\
\ }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/LL7M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T03_13_24.379539
path:
- '**/details_harness|drop|3_2023-10-24T03-13-24.379539.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T03-13-24.379539.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T03_13_24.379539
path:
- '**/details_harness|gsm8k|5_2023-10-24T03-13-24.379539.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T03-13-24.379539.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T03_13_24.379539
path:
- '**/details_harness|winogrande|5_2023-10-24T03-13-24.379539.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T03-13-24.379539.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- results_2023-10-10T15-26-54.562937.parquet
- split: 2023_10_24T03_13_24.379539
path:
- results_2023-10-24T03-13-24.379539.parquet
- split: latest
path:
- results_2023-10-24T03-13-24.379539.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/LL7M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/LL7M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/LL7M](https://huggingface.co/JosephusCheung/LL7M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__LL7M",
"harness_winogrande_5",
split="train")
```
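Each configuration name mirrors the evaluated task's name, with the `|`, `:`, and `-` separators replaced by underscores (for example, the task `harness|truthfulqa:mc|0` lives in the configuration `harness_truthfulqa_mc_0`). A minimal sketch of that naming convention, as observed in this card (the helper name is ours, not part of any `datasets` API):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task name to its dataset configuration name.

    Convention observed in this card: every '|', ':' and '-' separator
    in the task name becomes '_' in the configuration name.
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

# e.g. task_to_config_name("harness|winogrande|5") -> "harness_winogrande_5"
```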
## Latest results
These are the [latest results from run 2023-10-24T03:13:24.379539](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__LL7M/blob/main/results_2023-10-24T03-13-24.379539.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.019819630872483222,
"em_stderr": 0.0014273827117586067,
"f1": 0.07556312919463118,
"f1_stderr": 0.0018868261588306972,
"acc": 0.3234745894051663,
"acc_stderr": 0.007810892751790338
},
"harness|drop|3": {
"em": 0.019819630872483222,
"em_stderr": 0.0014273827117586067,
"f1": 0.07556312919463118,
"f1_stderr": 0.0018868261588306972
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.00213867030146044
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120236
}
}
```
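As the configuration list above shows, each run's split is named after the run timestamp, with the `-` and `:` characters of the ISO timestamp replaced by underscores (the fractional-second dot is kept), e.g. run `2023-10-24T03:13:24.379539` becomes split `2023_10_24T03_13_24.379539`. A small sketch of that convention, again with a helper name of our own choosing:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name.

    Convention observed in this card: '-' and ':' in the ISO timestamp
    become '_' in the split name; the fractional-second '.' is kept.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# e.g. run_timestamp_to_split("2023-10-24T03:13:24.379539")
#      -> "2023_10_24T03_13_24.379539"
```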
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,552 | […]
text2font/words_with_path_tags_version_2_train | 2023-10-10T20:31:19.000Z | ["region:us"] | text2font | null | null | 0 | 0 | 2023-10-10T15:29:11 | Entry not found | 15 | […]
text2font/words_with_path_tags_version_2_valid | 2023-10-10T20:31:57.000Z | [
"region:us"
] | text2font | null | null | 0 | 0 | 2023-10-10T15:29:47 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
text2font/words_with_path_tags_version_2_test | 2023-10-10T15:30:45.000Z | [
"region:us"
] | text2font | null | null | 0 | 0 | 2023-10-10T15:30:18 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
nalmeida/test_local2 | 2023-10-10T15:40:10.000Z | [
"region:us"
] | nalmeida | null | null | 0 | 0 | 2023-10-10T15:36:57 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test | 2023-10-26T20:44:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T15:40:01 | ---
pretty_name: Evaluation run of Lazycuber/L2-7b-Orca-WVG-Test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lazycuber/L2-7b-Orca-WVG-Test](https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T20:44:34.027885](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test/blob/main/results_2023-10-26T20-44-34.027885.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002202181208053691,\n\
\ \"em_stderr\": 0.00048005108166191996,\n \"f1\": 0.07443687080536941,\n\
\ \"f1_stderr\": 0.0016342523738966323,\n \"acc\": 0.4119262338489193,\n\
\ \"acc_stderr\": 0.009880953290999535\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002202181208053691,\n \"em_stderr\": 0.00048005108166191996,\n\
\ \"f1\": 0.07443687080536941,\n \"f1_stderr\": 0.0016342523738966323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \
\ \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.01227364800875999\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T20_44_34.027885
path:
- '**/details_harness|drop|3_2023-10-26T20-44-34.027885.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T20-44-34.027885.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T20_44_34.027885
path:
- '**/details_harness|gsm8k|5_2023-10-26T20-44-34.027885.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T20-44-34.027885.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T20_44_34.027885
path:
- '**/details_harness|winogrande|5_2023-10-26T20-44-34.027885.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T20-44-34.027885.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- results_2023-10-10T15-39-37.735727.parquet
- split: 2023_10_26T20_44_34.027885
path:
- results_2023-10-26T20-44-34.027885.parquet
- split: latest
path:
- results_2023-10-26T20-44-34.027885.parquet
---
# Dataset Card for Evaluation run of Lazycuber/L2-7b-Orca-WVG-Test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lazycuber/L2-7b-Orca-WVG-Test](https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test",
"harness_winogrande_5",
split="train")
```
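As the summary notes, each run's split is named after the run timestamp. Judging from the split names in the config list above (e.g. `2023_10_26T20_44_34.027885`), the mapping simply replaces `-` and `:` with `_` while keeping the fractional-seconds dot; this is an inference from the listed names, not a documented rule. A minimal sketch:

```python
# Derive the per-run split name from a run's ISO timestamp.
# Inferred from the config list: "-" and ":" become "_",
# while the fractional-seconds "." is kept as-is.
run_timestamp = "2023-10-26T20:44:34.027885"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2023_10_26T20_44_34.027885
```

Passing such a split name to `load_dataset` instead of `"train"` or `"latest"` selects that specific run.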
## Latest results
These are the [latest results from run 2023-10-26T20:44:34.027885](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test/blob/main/results_2023-10-26T20-44-34.027885.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.002202181208053691,
"em_stderr": 0.00048005108166191996,
"f1": 0.07443687080536941,
"f1_stderr": 0.0016342523738966323,
"acc": 0.4119262338489193,
"acc_stderr": 0.009880953290999535
},
"harness|drop|3": {
"em": 0.002202181208053691,
"em_stderr": 0.00048005108166191996,
"f1": 0.07443687080536941,
"f1_stderr": 0.0016342523738966323
},
"harness|gsm8k|5": {
"acc": 0.0803639120545868,
"acc_stderr": 0.007488258573239077
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.01227364800875999
}
}
```
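Each accuracy above is reported with a standard error. Assuming a normal approximation (an assumption of this sketch, not something the harness reports), a rough 95% interval for the Winogrande score can be computed as `acc ± 1.96 * acc_stderr`:

```python
# Rough 95% confidence interval for the Winogrande accuracy above,
# using the normal approximation acc ± 1.96 * stderr.
acc = 0.7434885556432518
acc_stderr = 0.01227364800875999
low, high = acc - 1.96 * acc_stderr, acc + 1.96 * acc_stderr
print(f"{low:.3f} to {high:.3f}")  # roughly 0.719 to 0.768
```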
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,672 | [
---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-LoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-LoRa](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T15:49:43.201517](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa/blob/main/results_2023-10-10T15-49-43.201517.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5216421138363554,\n\
\ \"acc_stderr\": 0.034984720641748575,\n \"acc_norm\": 0.5252295168080844,\n\
\ \"acc_norm_stderr\": 0.03497398634134131,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.5938354841447588,\n\
\ \"mc2_stderr\": 0.015090386269121684\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5656243776140211,\n\
\ \"acc_stderr\": 0.004946617138983521,\n \"acc_norm\": 0.7465644293965346,\n\
\ \"acc_norm_stderr\": 0.004340891673320502\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n\
\ \"acc_stderr\": 0.028040981380761547,\n \"acc_norm\": 0.5838709677419355,\n\
\ \"acc_norm_stderr\": 0.028040981380761547\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.033442837442804574,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.033442837442804574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.02528558599001784,\n \
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.02528558599001784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.710091743119266,\n \"acc_stderr\": 0.0194530666092016,\n \"acc_norm\"\
: 0.710091743119266,\n \"acc_norm_stderr\": 0.0194530666092016\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105307,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n\
\ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598642,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952236,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952236\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475349,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475349\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.028408302020332694,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.028408302020332694\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402616,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402616\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38852672750977835,\n\
\ \"acc_stderr\": 0.012448817838292355,\n \"acc_norm\": 0.38852672750977835,\n\
\ \"acc_norm_stderr\": 0.012448817838292355\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213542,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213542\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.511437908496732,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.511437908496732,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287248,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287248\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355044,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.5938354841447588,\n\
\ \"mc2_stderr\": 0.015090386269121684\n }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-49-43.201517.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- results_2023-10-10T15-49-43.201517.parquet
- split: latest
path:
- results_2023-10-10T15-49-43.201517.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-LoRa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-LoRa](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa",
"harness_truthfulqa_mc_0",
split="train")
```
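Each run's split name is derived from its timestamp, with the characters that are not allowed in split names replaced by underscores (this mapping is inferred from the split names in this card's config, not from an official API). A minimal sketch:

```python
def timestamp_to_split(ts: str) -> str:
    # Split names cannot contain '-' or ':', so both are
    # replaced with underscores; the '.' before microseconds stays.
    return ts.replace("-", "_").replace(":", "_")

# The run timestamp from this card maps to its split name:
print(timestamp_to_split("2023-10-10T15:49:43.201517"))
# 2023_10_10T15_49_43.201517
```

Passing the resulting string as `split=` selects that specific run instead of the `latest` alias.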
## Latest results
These are the [latest results from run 2023-10-10T15:49:43.201517](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa/blob/main/results_2023-10-10T15-49-43.201517.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5216421138363554,
"acc_stderr": 0.034984720641748575,
"acc_norm": 0.5252295168080844,
"acc_norm_stderr": 0.03497398634134131,
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.5938354841447588,
"mc2_stderr": 0.015090386269121684
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5656243776140211,
"acc_stderr": 0.004946617138983521,
"acc_norm": 0.7465644293965346,
"acc_norm_stderr": 0.004340891673320502
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5838709677419355,
"acc_stderr": 0.028040981380761547,
"acc_norm": 0.5838709677419355,
"acc_norm_stderr": 0.028040981380761547
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.02528558599001784,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.02528558599001784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.0194530666092016,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.0194530666092016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105307,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598642,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952236,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952236
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475349,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475349
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.028408302020332694,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.028408302020332694
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402616,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402616
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38852672750977835,
"acc_stderr": 0.012448817838292355,
"acc_norm": 0.38852672750977835,
"acc_norm_stderr": 0.012448817838292355
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213542,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213542
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.511437908496732,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.511437908496732,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287248,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287248
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355044,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.5938354841447588,
"mc2_stderr": 0.015090386269121684
}
}
```
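The top-level `"all"` block aggregates the per-task entries. As a quick sanity check, a mean accuracy can be recomputed from the per-task dictionaries; the sketch below uses only three of the tasks shown above, so its value will not match the reported `"all"` average over every task:

```python
# Hypothetical subset of the per-task results from this card, used
# only to illustrate how an aggregate accuracy is computed.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4868421052631579},
}

# Unweighted mean over the selected tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 4))
# 0.3981
```

The same pattern applies to `acc_norm`, `mc1`, and `mc2` for the tasks that report them.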
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 65,085 | [
[
-0.050048828125,
-0.060516357421875,
0.02032470703125,
0.0133209228515625,
-0.01052093505859375,
-0.006076812744140625,
0.0024814605712890625,
-0.012725830078125,
0.041748046875,
-0.0011463165283203125,
-0.033172607421875,
-0.047332763671875,
-0.0309600830078125... |
borggAI/prompting-10102023 | 2023-10-10T15:57:45.000Z | [
"region:us"
] | borggAI | null | null | 0 | 0 | 2023-10-10T15:56:27 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01494598388671875,
0.057159423828125,
0.028839111328125,
-0.0350341796875,
0.04656982421875,
0.052490234375,
0.00504302978515625,
0.0513916015625,
0.016998291015625,
-0.0521240234375,
-0.0149993896484375,
-0.06036376953125,
0.03790283... |
nalmeida/test2 | 2023-10-10T16:03:36.000Z | [
"region:us"
] | nalmeida | null | null | 0 | 0 | 2023-10-10T16:03:34 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pregunta
dtype: string
- name: respuesta
dtype: string
splits:
- name: train
num_bytes: 498
num_examples: 5
download_size: 2175
dataset_size: 498
---
# Dataset Card for "test2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 461 | [
[
-0.0309906005859375,
-0.0191497802734375,
0.006938934326171875,
0.0117034912109375,
-0.007495880126953125,
-0.0035648345947265625,
0.0229034423828125,
-0.0162811279296875,
0.033905029296875,
0.015655517578125,
-0.048980712890625,
-0.034027099609375,
-0.040039062... |
reach-vb/random-audios | 2023-10-12T14:17:44.000Z | [
"region:us"
] | reach-vb | null | null | 1 | 0 | 2023-10-10T16:08:26 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
MrB141107/MrB141107 | 2023-10-10T16:13:39.000Z | [
"region:us"
] | MrB141107 | null | null | 0 | 0 | 2023-10-10T16:13:39 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
marianna13/litarch | 2023-10-29T18:18:57.000Z | [
"language:en",
"license:cc",
"region:us"
] | marianna13 | null | null | 0 | 0 | 2023-10-10T16:15:58 | ---
license: cc
language:
- en
---
Textbooks from [PubChem Literature Archive](https://ftp.ncbi.nlm.nih.gov/pub/litarch/).
# Image-Text Pairs
```
[
['litarch_figures/ca/84/gene_NBK1116/angelmanF1.jpg', '\nIndividuals depicted have a genetically confirmed diagnosis of Angelman syndrome. Happy expression and an unstable gait accompanied by uplifted arms are commonly observed. At times, the facial appearance can suggest the diagnosis, but usually facial features are not distinctive.\n', ''],
['litarch_figures/ca/84/gene_NBK1116/angelmanF2.jpg', '\nSchematic drawing of chromosome region 15q11.2-q13 indicating the breakpoint regions BP1-BP6. Low copy repeat elements are located within these breakpoint regions (see text for details). Approximately 90% of chromosome deletions resulting in Angelman syndrome initiate at BP1 or BP2 and terminate in region BP3 (class I and class II). Approximately 10% of deletions are larger, typically spanning from BP1 to BP5, rarely beyond BP5. Genes that are not imprinted and thus biparentally expressed are noted by the open circles. The two critical imprinting center (IC) elements, the AS-SRO and the PWS-SRO, are drawn as open boxes. The gene SNURF-SNRPN, drawn as a shaded box, has some overlap with the PWS-SRO. The SNURF-SNRPN sense/UBE3A antisense transcript is labeled UBE3A-AS.\n', ''],
['litarch_figures/ca/84/gene_NBK1116/angelmanF3.jpg', '\nThe pedigree illustrates imprinting inheritance in Angelman syndrome (AS). Inheritance of a deleterious UBE3A pathogenic variant from the male (top left, I-1) has no effect on the two children (II-2, II-4) who inherit his pathogenic variant because the mutated UBE3A has already been inactivated in his germ cells (i.e., by imprinting) and because each of these children also inherited a normally activated UBE3A from their mother (I-2). (Note: Only one active UBE3A allele is required for normal brain functioning.) If his carrier daughter (II-2) transmits the UBE3A pathogenic variant to the grandson and granddaughter (III-1, III-2), they both will have AS since each will have also inherited an inactivated UBE3A from their father; thus, neither child will express a UBE3A allele. The same explanation pertains for AS occurring in the great grand-niece (bottom right, IV-2).\n', '']
]
```
# Interleaved:
```
[
["Getting by with the bare minimum seems to be the modus operandi of Mycobacterium leprae \u2014 the causal agent of leprosy. Its genome sequence reveals that it has undergone massive genome 'downsizing' over time, discarding more than half its genes and rendering it the most striking example of genome reduction in a microbial pathogen."],
["The leprosy bacillus is famed for being the first microorganism definitively shown to be associated with human disease. It evades the host's immune response by invading and propagating inside the vacuoles of macrophages called phagosomes. From there, it infects the Schwann cells of the peripheral nervous system, where it disrupts myelin production, thus leading to the characteristic features of leprosy, which include skin lesions and sensory loss."],
["litarch_figures/df/45/coffeebrk_NBK2345/A559.jpg",
"\nProtein coding genes distribution map for Mycobacterium leprae.\nThe leprosy bacillus genome contains numerous examples of gene deletion and decay. The relative locations of various genes in the genome are depicted in the map above. Protein coding genes are color coded in the map according to their classification within clusters of orthologous groups (COGs) functional categories. COGs represent proteins or groups of paralogs that are found in at least 3 phylogenetically-distant genomes. For more information about COGs, see Science 1997 Oct 24:278(5338):631-7.\n\n",
""],
["Protein coding genes distribution map for Mycobacterium leprae."]
]
```
# Text
```
"Getting by with the bare minimum seems to be the modus operandi of Mycobacterium leprae \u2014 the causal agent of leprosy. Its genome sequence reveals that it has undergone massive genome 'downsizing' over time, discarding more than half its genes and rendering it the most striking example of genome reduction in a microbial pathogen.\nThe leprosy bacillus is famed for being the first microorganism definitively shown to be associated with human disease. It evades the host's immune response by invading and propagating inside the vacuoles of macrophages called phagosomes. From there, it infects the Schwann cells of the peripheral nervous system, where it disrupts myelin production, thus leading to the characteristic features of leprosy, which include skin lesions and sensory loss... "
``` | 4,634 | [
[
-0.0430908203125,
-0.049407958984375,
0.036224365234375,
-0.013885498046875,
0.00582122802734375,
-0.018463134765625,
0.029876708984375,
-0.0243682861328125,
0.058868408203125,
0.0594482421875,
-0.03570556640625,
-0.0282440185546875,
-0.036407470703125,
0.02... |
davanstrien/test_card | 2023-10-11T01:41:06.000Z | [
"region:us"
] | davanstrien | null | null | 0 | 0 | 2023-10-10T16:23:45 | ---
dataset_info:
features:
- name: id
dtype: string
- name: lastModified
dtype: string
- name: tags
sequence: string
- name: author
dtype: string
- name: description
dtype: string
- name: citation
dtype: string
- name: cardData
dtype: 'null'
- name: likes
dtype: int64
- name: downloads
dtype: int64
- name: card
dtype: string
splits:
- name: train
num_bytes: 203107730
num_examples: 69309
download_size: 52854496
dataset_size: 203107730
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_card"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 770 | [
[
-0.044830322265625,
-0.024017333984375,
0.00670623779296875,
0.01065826416015625,
-0.01062774658203125,
-0.0006818771362304688,
0.01702880859375,
-0.0059814453125,
0.051300048828125,
0.023345947265625,
-0.053802490234375,
-0.04986572265625,
-0.03411865234375,
... |
Diesertikel/nils | 2023-10-10T16:31:40.000Z | [
"region:us"
] | Diesertikel | null | null | 0 | 0 | 2023-10-10T16:31:39 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
deadbits/vigil-gandalf-instruction-bypass-ada-002 | 2023-10-10T19:07:03.000Z | [
"embeddings",
"text",
"security",
"region:us"
] | deadbits | null | null | 0 | 0 | 2023-10-10T16:34:45 | ---
tags:
- embeddings
- text
- security
pretty_name: 'Vigil: LLM Gandalf Instruction Bypass text-embedding-ada-002'
---
# Vigil: LLM Gandalf Instruction Bypass text-embedding-ada-002
- **Repo:** [github.com/deadbits/vigil-llm](https://github.com/deadbits/vigil-llm)
`Vigil` is a Python framework and REST API for assessing Large Language Model (LLM) prompts against a set of scanners to detect prompt injections, jailbreaks, and other potentially risky inputs.
This repository contains `text-embedding-ada-002` embeddings for the [Lakera Gandalf "Ignore Instructions" dataset](https://huggingface.co/datasets/Lakera/gandalf_ignore_instructions).
All prompts from the original dataset have been lowercased before embedding.
You can use the [parquet2vdb.py](https://github.com/deadbits/prompt-injection-defense/blob/main/vigil/utils/parquet2vdb.py) utility to load the embeddings in the Vigil chromadb instance, or use them in your own application.
## Format
```json
[
{
"text": str,
"embedding": [],
"model": "text-embedding-ada-002"
}
]
```
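As a minimal sketch of how records in this format can be used for detection (the records and embeddings below are toy stand-ins, not real `text-embedding-ada-002` vectors), cosine similarity between a query embedding and the stored embeddings flags prompts close to known instruction-bypass attempts:

```python
import math

# Toy records matching the dataset's {text, embedding, model} format.
records = [
    {"text": "ignore all previous instructions", "embedding": [0.9, 0.1, 0.0],
     "model": "text-embedding-ada-002"},
    {"text": "what is the capital of france", "embedding": [0.0, 0.2, 0.9],
     "model": "text-embedding-ada-002"},
]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def nearest(query_embedding, records):
    """Return the stored record most similar to the query embedding."""
    return max(records, key=lambda r: cosine(query_embedding, r["embedding"]))

hit = nearest([0.85, 0.15, 0.05], records)
print(hit["text"])
```

In practice the query embedding would come from the same embedding model, and a similarity threshold would decide whether to flag the prompt.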
**Original dataset:** https://huggingface.co/datasets/Lakera/gandalf_ignore_instructions
```
@InProceedings{gandalf_ignore_instructions,
title = {gandalf_ignore_instructions},
author={Lakera AI (https://www.lakera.ai)},
year={2023}
}
``` | 1,307 | [
[
-0.0022373199462890625,
-0.08087158203125,
0.046478271484375,
0.0209197998046875,
-0.024444580078125,
0.005886077880859375,
-0.0025119781494140625,
-0.01151275634765625,
0.009307861328125,
0.032379150390625,
-0.03228759765625,
-0.0728759765625,
-0.04885864257812... |
autoevaluate/autoeval-eval-acronym_identification-default-35e977-94268146033 | 2023-10-10T16:50:56.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-10T16:50:53 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf | 2023-10-24T11:08:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T17:25:55 | ---
pretty_name: Evaluation run of lizpreciatior/lzlv_70b_fp16_hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T11:08:18.401041](https://huggingface.co/datasets/open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf/blob/main/results_2023-10-24T11-08-18.401041.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.040058724832214766,\n\
\ \"em_stderr\": 0.002008216561907643,\n \"f1\": 0.10676174496644267,\n\
\ \"f1_stderr\": 0.002328625422990624,\n \"acc\": 0.5717896950225979,\n\
\ \"acc_stderr\": 0.011591305235224383\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.040058724832214766,\n \"em_stderr\": 0.002008216561907643,\n\
\ \"f1\": 0.10676174496644267,\n \"f1_stderr\": 0.002328625422990624\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30932524639878695,\n \
\ \"acc_stderr\": 0.012731710925078124\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370642\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T11_08_18.401041
path:
- '**/details_harness|drop|3_2023-10-24T11-08-18.401041.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T11-08-18.401041.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T11_08_18.401041
path:
- '**/details_harness|gsm8k|5_2023-10-24T11-08-18.401041.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T11-08-18.401041.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T11_08_18.401041
path:
- '**/details_harness|winogrande|5_2023-10-24T11-08-18.401041.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T11-08-18.401041.parquet'
- config_name: results
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- results_2023-10-10T17-25-31.421123.parquet
- split: 2023_10_24T11_08_18.401041
path:
- results_2023-10-24T11-08-18.401041.parquet
- split: latest
path:
- results_2023-10-24T11-08-18.401041.parquet
---
# Dataset Card for Evaluation run of lizpreciatior/lzlv_70b_fp16_hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf",
"harness_winogrande_5",
split="train")
```
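Each per-task configuration in this repository follows the naming pattern `harness_<task>[_<subject>]_<shots>` (e.g. `harness_hendrycksTest_econometrics_5`, `harness_winogrande_5`). If you want to build those names programmatically, a small illustrative helper (hypothetical, not part of the `datasets` library) might look like:

```python
def detail_config(task, subject=None, shots=5):
    """Build a per-task configuration name for this repository.

    Mirrors the names visible in the config list of this card, e.g.
    'harness_hendrycksTest_econometrics_5' or 'harness_winogrande_5'.
    This helper is illustrative only, not an official API.
    """
    parts = ["harness", task]
    if subject is not None:
        parts.append(subject)
    parts.append(str(shots))
    return "_".join(parts)
```

The returned string can then be passed as the second argument of `load_dataset`, e.g. `detail_config("hendrycksTest", "econometrics")` yields `"harness_hendrycksTest_econometrics_5"`.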
## Latest results
These are the [latest results from run 2023-10-24T11:08:18.401041](https://huggingface.co/datasets/open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf/blob/main/results_2023-10-24T11-08-18.401041.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" split of each eval):
```python
{
"all": {
"em": 0.040058724832214766,
"em_stderr": 0.002008216561907643,
"f1": 0.10676174496644267,
"f1_stderr": 0.002328625422990624,
"acc": 0.5717896950225979,
"acc_stderr": 0.011591305235224383
},
"harness|drop|3": {
"em": 0.040058724832214766,
"em_stderr": 0.002008216561907643,
"f1": 0.10676174496644267,
"f1_stderr": 0.002328625422990624
},
"harness|gsm8k|5": {
"acc": 0.30932524639878695,
"acc_stderr": 0.012731710925078124
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370642
}
}
```
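Each run is stored under a split named after its timestamp (e.g. `2023_10_24T11_08_18.401041`), with `latest` acting as an alias for the most recent run. If you ever need to resolve the newest run yourself, a minimal sketch (assuming every non-`latest` split name follows this timestamp format):

```python
from datetime import datetime

def newest_run_split(split_names):
    """Return the most recent timestamp-named split, ignoring the
    'latest' alias. Assumes names like '2023_10_24T11_08_18.401041'."""
    stamps = [s for s in split_names if s != "latest"]
    # Parse each name into a datetime so comparison is explicit rather
    # than relying on lexicographic ordering of the raw strings.
    return max(stamps, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))
```

Because the timestamps are zero-padded, a plain lexicographic `max` would give the same answer, but parsing makes the assumption about the format explicit.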
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,676 | [
---
pretty_name: Evaluation run of Doctor-Shotgun/mythospice-limarp-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Doctor-Shotgun/mythospice-limarp-70b](https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T01:07:28.245203](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b/blob/main/results_2023-10-25T01-07-28.245203.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04771392617449664,\n\
\ \"em_stderr\": 0.002182960840414587,\n \"f1\": 0.11594274328859033,\n\
\ \"f1_stderr\": 0.00247314456935574,\n \"acc\": 0.5746822740673767,\n\
\ \"acc_stderr\": 0.01174970000558032\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.04771392617449664,\n \"em_stderr\": 0.002182960840414587,\n\
\ \"f1\": 0.11594274328859033,\n \"f1_stderr\": 0.00247314456935574\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32221379833206976,\n \
\ \"acc_stderr\": 0.01287243548118878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971859\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T01_07_28.245203
path:
- '**/details_harness|drop|3_2023-10-25T01-07-28.245203.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T01-07-28.245203.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T01_07_28.245203
path:
- '**/details_harness|gsm8k|5_2023-10-25T01-07-28.245203.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T01-07-28.245203.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T01_07_28.245203
path:
- '**/details_harness|winogrande|5_2023-10-25T01-07-28.245203.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T01-07-28.245203.parquet'
- config_name: results
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- results_2023-10-10T17-32-09.949446.parquet
- split: 2023_10_25T01_07_28.245203
path:
- results_2023-10-25T01-07-28.245203.parquet
- split: latest
path:
- results_2023-10-25T01-07-28.245203.parquet
---
# Dataset Card for Evaluation run of Doctor-Shotgun/mythospice-limarp-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Doctor-Shotgun/mythospice-limarp-70b](https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b",
"harness_winogrande_5",
split="train")
```
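Judging from the config list above, the per-task config names appear to be derived from the harness task identifiers by replacing the `|`, `:`, and `-` separators with underscores (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). This is an inference from this card's config list, not an official API, but it can be sketched as:

```python
def task_to_config(task: str) -> str:
    """Map a harness task id to the corresponding dataset config name.

    Inferred from the config names listed in this card (pipes, colons,
    and hyphens all become underscores); treat it as an assumption.
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|winogrande|5"))
print(task_to_config("harness|hendrycksTest-world_religions|5"))
```

The resulting string can then be passed as the second argument to `load_dataset`, as in the snippet above.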
## Latest results
These are the [latest results from run 2023-10-25T01:07:28.245203](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b/blob/main/results_2023-10-25T01-07-28.245203.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04771392617449664,
"em_stderr": 0.002182960840414587,
"f1": 0.11594274328859033,
"f1_stderr": 0.00247314456935574,
"acc": 0.5746822740673767,
"acc_stderr": 0.01174970000558032
},
"harness|drop|3": {
"em": 0.04771392617449664,
"em_stderr": 0.002182960840414587,
"f1": 0.11594274328859033,
"f1_stderr": 0.00247314456935574
},
"harness|gsm8k|5": {
"acc": 0.32221379833206976,
"acc_stderr": 0.01287243548118878
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971859
}
}
```
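The JSON above can be consumed directly with the standard library; a minimal sketch (per-task values copied verbatim from the results block above) that pulls out each task's accuracy:

```python
import json

# Per-task metrics copied from the latest-results JSON shown above.
latest = json.loads("""
{
  "harness|gsm8k|5":      {"acc": 0.32221379833206976, "acc_stderr": 0.01287243548118878},
  "harness|winogrande|5": {"acc": 0.8271507498026835,  "acc_stderr": 0.010626964529971859}
}
""")

for task, metrics in latest.items():
    # Report accuracy with its standard error for each harness task.
    print(f"{task}: acc={metrics['acc']:.4f} (+/- {metrics['acc_stderr']:.4f})")
```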
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b | 2023-10-24T21:51:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T17:34:31 | ---
pretty_name: Evaluation run of Doctor-Shotgun/mythospice-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Doctor-Shotgun/mythospice-70b](https://huggingface.co/Doctor-Shotgun/mythospice-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T21:51:42.689346](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b/blob/main/results_2023-10-24T21-51-42.689346.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n\
\ \"em_stderr\": 0.0005340111700415905,\n \"f1\": 0.06940331375838925,\n\
\ \"f1_stderr\": 0.0014269735757716981,\n \"acc\": 0.5668306034144879,\n\
\ \"acc_stderr\": 0.011562556636019638\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415905,\n\
\ \"f1\": 0.06940331375838925,\n \"f1_stderr\": 0.0014269735757716981\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3009855951478393,\n \
\ \"acc_stderr\": 0.012634504465211199\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828079\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Doctor-Shotgun/mythospice-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T21_51_42.689346
path:
- '**/details_harness|drop|3_2023-10-24T21-51-42.689346.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T21-51-42.689346.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T21_51_42.689346
path:
- '**/details_harness|gsm8k|5_2023-10-24T21-51-42.689346.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T21-51-42.689346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T21_51_42.689346
path:
- '**/details_harness|winogrande|5_2023-10-24T21-51-42.689346.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T21-51-42.689346.parquet'
- config_name: results
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- results_2023-10-10T17-34-08.268208.parquet
- split: 2023_10_24T21_51_42.689346
path:
- results_2023-10-24T21-51-42.689346.parquet
- split: latest
path:
- results_2023-10-24T21-51-42.689346.parquet
---
# Dataset Card for Evaluation run of Doctor-Shotgun/mythospice-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Doctor-Shotgun/mythospice-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Doctor-Shotgun/mythospice-70b](https://huggingface.co/Doctor-Shotgun/mythospice-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b",
"harness_winogrande_5",
	split="latest")
```
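As the config list above shows, each run's split name is simply the run timestamp with dashes replaced by underscores. A tiny helper (purely illustrative, not part of the `datasets` API) makes the mapping explicit:

```python
def run_timestamp_to_split(ts: str) -> str:
    # Split names in this card are the run timestamp with "-" replaced by "_",
    # e.g. "2023-10-24T21-51-42.689346" -> "2023_10_24T21_51_42.689346".
    return ts.replace("-", "_")

assert run_timestamp_to_split("2023-10-24T21-51-42.689346") == "2023_10_24T21_51_42.689346"
```

The derived name can then be passed as the `split` argument to `load_dataset` to pick a specific run instead of `"latest"`.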
## Latest results
These are the [latest results from run 2023-10-24T21:51:42.689346](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b/blob/main/results_2023-10-24T21-51-42.689346.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415905,
"f1": 0.06940331375838925,
"f1_stderr": 0.0014269735757716981,
"acc": 0.5668306034144879,
"acc_stderr": 0.011562556636019638
},
"harness|drop|3": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415905,
"f1": 0.06940331375838925,
"f1_stderr": 0.0014269735757716981
},
"harness|gsm8k|5": {
"acc": 0.3009855951478393,
"acc_stderr": 0.012634504465211199
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828079
}
}
```
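In this payload the aggregate `"acc"` under `"all"` is the plain mean of the two per-task accuracies. A quick sanity check (numbers copied verbatim from the results above) confirms this:

```python
# Per-task accuracies reported above (copied verbatim from this card).
per_task_acc = {
    "harness|gsm8k|5": 0.3009855951478393,
    "harness|winogrande|5": 0.8326756116811366,
}

# Here the aggregate "acc" equals the unweighted mean of the task accuracies.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(mean_acc)  # matches "all" -> "acc": 0.5668306034144879
```

The same check can be applied to `em` and `f1`, which here come from the single `harness|drop|3` task and so carry over unchanged into the `"all"` block.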
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,670 | [
[
-0.02508544921875,
-0.04632568359375,
0.0173797607421875,
0.004955291748046875,
-0.0127105712890625,
0.01161956787109375,
-0.0227203369140625,
-0.01535797119140625,
0.032073974609375,
0.039794921875,
-0.049041748046875,
-0.0748291015625,
-0.0562744140625,
0.... |
Satooo123/embauche | 2023-10-10T17:48:52.000Z | [
"region:us"
] | Satooo123 | null | null | 0 | 0 | 2023-10-10T17:48:52 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
namthai-dev/my-dataset | 2023-10-10T18:24:38.000Z | [
"region:us"
] | namthai-dev | null | null | 0 | 0 | 2023-10-10T18:24:38 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
PTMalatji/mC4SA | 2023-10-10T18:45:39.000Z | [
"region:us"
] | PTMalatji | null | null | 0 | 0 | 2023-10-10T18:45:39 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2 | 2023-10-28T11:30:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T19:14:44 | ---
pretty_name: Evaluation run of ICBU-NPU/FashionGPT-70B-V1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ICBU-NPU/FashionGPT-70B-V1.2](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T11:30:26.266910](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2/blob/main/results_2023-10-28T11-30-26.266910.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007235738255033557,\n\
\ \"em_stderr\": 0.00086796885701786,\n \"f1\": 0.08897126677852359,\n\
\ \"f1_stderr\": 0.0016572567969813893,\n \"acc\": 0.5329529019437246,\n\
\ \"acc_stderr\": 0.011217384303167696\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007235738255033557,\n \"em_stderr\": 0.00086796885701786,\n\
\ \"f1\": 0.08897126677852359,\n \"f1_stderr\": 0.0016572567969813893\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2403335860500379,\n \
\ \"acc_stderr\": 0.01176958070383695\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498444\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T11_30_26.266910
path:
- '**/details_harness|drop|3_2023-10-28T11-30-26.266910.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T11-30-26.266910.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T11_30_26.266910
path:
- '**/details_harness|gsm8k|5_2023-10-28T11-30-26.266910.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T11-30-26.266910.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T11_30_26.266910
path:
- '**/details_harness|winogrande|5_2023-10-28T11-30-26.266910.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T11-30-26.266910.parquet'
- config_name: results
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- results_2023-10-10T19-14-20.366315.parquet
- split: 2023_10_28T11_30_26.266910
path:
- results_2023-10-28T11-30-26.266910.parquet
- split: latest
path:
- results_2023-10-28T11-30-26.266910.parquet
---
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1.2](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2",
"harness_winogrande_5",
split="train")
```
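The timestamped split names in the `configs` section above are derived from the run timestamps, with the date dashes and time colons replaced by underscores. A minimal sketch of that mapping (the helper name is ours, not part of any library):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this dataset.

    e.g. "2023-10-28T11:30:26.266910" -> "2023_10_28T11_30_26.266910"
    """
    # Split names keep the "T" separator and the fractional seconds,
    # but replace "-" and ":" with "_".
    return ts.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2023-10-28T11:30:26.266910"))
```

Either such a timestamped split or `"latest"` can be passed as the `split` argument to `load_dataset`.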
## Latest results
These are the [latest results from run 2023-10-28T11:30:26.266910](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2/blob/main/results_2023-10-28T11-30-26.266910.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007235738255033557,
"em_stderr": 0.00086796885701786,
"f1": 0.08897126677852359,
"f1_stderr": 0.0016572567969813893,
"acc": 0.5329529019437246,
"acc_stderr": 0.011217384303167696
},
"harness|drop|3": {
"em": 0.007235738255033557,
"em_stderr": 0.00086796885701786,
"f1": 0.08897126677852359,
"f1_stderr": 0.0016572567969813893
},
"harness|gsm8k|5": {
"acc": 0.2403335860500379,
"acc_stderr": 0.01176958070383695
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498444
}
}
```
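In this run, the `"all"` accuracy appears to be the unweighted mean of the per-task accuracies (gsm8k and winogrande); a quick sketch checking that against the numbers above:

```python
# Per-task accuracies copied from the results JSON above.
task_accs = {
    "harness|gsm8k|5": 0.2403335860500379,
    "harness|winogrande|5": 0.8255722178374112,
}

# Unweighted mean across the tasks that report "acc".
mean_acc = sum(task_accs.values()) / len(task_accs)

# Matches the aggregated "all" accuracy (0.5329529019437246)
# up to floating-point rounding.
print(mean_acc)
```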
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,649 | [
crimelab357/autotrain-data-n3xz-uu07-5e7x | 2023-10-10T19:15:36.000Z | ["region:us"] | crimelab357 | null | null | 0 | 0 | 2023-10-10T19:15:33 | Entry not found | 15
open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2 | 2023-10-24T01:21:22.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T19:39:49 |
---
pretty_name: Evaluation run of euclaise/falcon_1b_stage3_2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [euclaise/falcon_1b_stage3_2](https://huggingface.co/euclaise/falcon_1b_stage3_2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T01:21:10.323077](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2/blob/main/results_2023-10-24T01-21-10.323077.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15541107382550334,\n\
\ \"em_stderr\": 0.0037102512781803346,\n \"f1\": 0.2052170721476513,\n\
\ \"f1_stderr\": 0.0038511842764741527,\n \"acc\": 0.3022888713496448,\n\
\ \"acc_stderr\": 0.006870839193772672\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.15541107382550334,\n \"em_stderr\": 0.0037102512781803346,\n\
\ \"f1\": 0.2052170721476513,\n \"f1_stderr\": 0.0038511842764741527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6045777426992897,\n\
\ \"acc_stderr\": 0.013741678387545343\n }\n}\n```"
repo_url: https://huggingface.co/euclaise/falcon_1b_stage3_2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T01_21_10.323077
path:
- '**/details_harness|drop|3_2023-10-24T01-21-10.323077.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T01-21-10.323077.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T01_21_10.323077
path:
- '**/details_harness|gsm8k|5_2023-10-24T01-21-10.323077.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T01-21-10.323077.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T01_21_10.323077
path:
- '**/details_harness|winogrande|5_2023-10-24T01-21-10.323077.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T01-21-10.323077.parquet'
- config_name: results
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- results_2023-10-10T19-39-31.631601.parquet
- split: 2023_10_24T01_21_10.323077
path:
- results_2023-10-24T01-21-10.323077.parquet
- split: latest
path:
- results_2023-10-24T01-21-10.323077.parquet
---
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3_2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage3_2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage3_2](https://huggingface.co/euclaise/falcon_1b_stage3_2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2",
"harness_winogrande_5",
split="train")
```
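The run splits follow a predictable naming scheme: a split name is the run timestamp with `-` and `:` replaced by `_` (the fractional seconds are kept). A minimal sketch of that mapping, using a timestamp from this card (the helper name is illustrative, not part of the `datasets` API):

```python
# Sketch: derive the split name used in this card from a run timestamp.
# The helper below is illustrative, not part of the `datasets` library.
def split_name_from_timestamp(ts: str) -> str:
    date, time = ts.split("T")
    # "-" and ":" become "_"; the fractional seconds are kept as-is.
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(split_name_from_timestamp("2023-10-24T01:21:10.323077"))
# -> 2023_10_24T01_21_10.323077
```

Passing the resulting string as `split=` selects that specific run instead of the `latest` alias.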
## Latest results
These are the [latest results from run 2023-10-24T01:21:10.323077](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2/blob/main/results_2023-10-24T01-21-10.323077.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.15541107382550334,
"em_stderr": 0.0037102512781803346,
"f1": 0.2052170721476513,
"f1_stderr": 0.0038511842764741527,
"acc": 0.3022888713496448,
"acc_stderr": 0.006870839193772672
},
"harness|drop|3": {
"em": 0.15541107382550334,
"em_stderr": 0.0037102512781803346,
"f1": 0.2052170721476513,
"f1_stderr": 0.0038511842764741527
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6045777426992897,
"acc_stderr": 0.013741678387545343
}
}
```
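The aggregated `"all"` block above is consistent with an unweighted mean over the tasks that report a given metric; a quick check of the `acc` figure, assuming that aggregation rule:

```python
# Recompute the aggregated "acc" from the per-task results shown above,
# assuming "all" is the unweighted mean over tasks reporting that metric.
results = {
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.6045777426992897,
                             "acc_stderr": 0.013741678387545343},
}
accs = [task["acc"] for task in results.values() if "acc" in task]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches the "all" acc of ~0.30228887 above
```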
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,569 | [
---
pretty_name: Evaluation run of llm-agents/tora-code-34b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-code-34b-v1.0](https://huggingface.co/llm-agents/tora-code-34b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T14:45:33.469419](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0/blob/main/results_2023-10-29T14-45-33.469419.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931189816,\n \"f1\": 0.04579802852349004,\n\
\ \"f1_stderr\": 0.0010433016886932766,\n \"acc\": 0.40654288933581384,\n\
\ \"acc_stderr\": 0.011193892157736274\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931189816,\n\
\ \"f1\": 0.04579802852349004,\n \"f1_stderr\": 0.0010433016886932766\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \
\ \"acc_stderr\": 0.009298499235587867\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6819258089976322,\n \"acc_stderr\": 0.013089285079884681\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-34b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T14_45_33.469419
path:
- '**/details_harness|drop|3_2023-10-29T14-45-33.469419.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T14-45-33.469419.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T14_45_33.469419
path:
- '**/details_harness|gsm8k|5_2023-10-29T14-45-33.469419.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T14-45-33.469419.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T14_45_33.469419
path:
- '**/details_harness|winogrande|5_2023-10-29T14-45-33.469419.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T14-45-33.469419.parquet'
- config_name: results
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- results_2023-10-10T19-58-46.874384.parquet
- split: 2023_10_29T14_45_33.469419
path:
- results_2023-10-29T14-45-33.469419.parquet
- split: latest
path:
- results_2023-10-29T14-45-33.469419.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-code-34b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-code-34b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-code-34b-v1.0](https://huggingface.co/llm-agents/tora-code-34b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0",
"harness_winogrande_5",
split="train")
```
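As a side note, the per-task parquet files referenced by each configuration of this dataset follow a consistent naming pattern built from the task name, the few-shot count, and the run timestamp. A minimal helper sketching that convention (the function name is illustrative, not part of the dataset):

```python
def detail_file_glob(task: str, shots: int, timestamp: str) -> str:
    # Glob pattern inferred from the data_files listing in this card:
    # '**/details_harness|<task>|<shots>_<timestamp>.parquet'
    return f"**/details_harness|{task}|{shots}_{timestamp}.parquet"

# e.g. the 5-shot winogrande run from 2023-10-29:
detail_file_glob("winogrande", 5, "2023-10-29T14-45-33.469419")
# -> '**/details_harness|winogrande|5_2023-10-29T14-45-33.469419.parquet'
```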
## Latest results
These are the [latest results from run 2023-10-29T14:45:33.469419](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0/blob/main/results_2023-10-29T14-45-33.469419.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189816,
"f1": 0.04579802852349004,
"f1_stderr": 0.0010433016886932766,
"acc": 0.40654288933581384,
"acc_stderr": 0.011193892157736274
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189816,
"f1": 0.04579802852349004,
"f1_stderr": 0.0010433016886932766
},
"harness|gsm8k|5": {
"acc": 0.13115996967399546,
"acc_stderr": 0.009298499235587867
},
"harness|winogrande|5": {
"acc": 0.6819258089976322,
"acc_stderr": 0.013089285079884681
}
}
```
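In the JSON above, the `"all"` block's `acc` appears to be the unweighted mean of the two per-task accuracies (gsm8k and winogrande), with `em`/`f1` carried over from drop. A quick check with the values copied from the JSON:

```python
# Values copied from the "latest results" JSON above.
gsm8k_acc = 0.13115996967399546
winogrande_acc = 0.6819258089976322

# The "all" block's acc matches the unweighted mean of the task accuracies.
overall_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(overall_acc - 0.40654288933581384) < 1e-9
```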
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,682 | [
[
-0.0277099609375,
-0.04339599609375,
0.02081298828125,
0.019744873046875,
-0.01076507568359375,
0.021148681640625,
-0.0196380615234375,
-0.01218414306640625,
0.03521728515625,
0.04156494140625,
-0.05242919921875,
-0.0694580078125,
-0.0458984375,
0.0191040039... |
open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b | 2023-10-28T13:05:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 1 | 0 | 2023-10-10T20:14:41 | ---
pretty_name: Evaluation run of Delcos/Mistral-Pygmalion-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Delcos/Mistral-Pygmalion-7b](https://huggingface.co/Delcos/Mistral-Pygmalion-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T13:05:25.339926](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b/blob/main/results_2023-10-28T13-05-25.339926.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196460774,\n \"f1\": 0.05936241610738259,\n\
\ \"f1_stderr\": 0.0013656193493625718,\n \"acc\": 0.41059662883495607,\n\
\ \"acc_stderr\": 0.009533380943461503\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460774,\n\
\ \"f1\": 0.05936241610738259,\n \"f1_stderr\": 0.0013656193493625718\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855575\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Delcos/Mistral-Pygmalion-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|arc:challenge|25_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T13_05_25.339926
path:
- '**/details_harness|drop|3_2023-10-28T13-05-25.339926.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T13-05-25.339926.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T13_05_25.339926
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-05-25.339926.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-05-25.339926.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hellaswag|10_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T13_05_25.339926
path:
- '**/details_harness|winogrande|5_2023-10-28T13-05-25.339926.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T13-05-25.339926.parquet'
- config_name: results
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- results_2023-10-10T20-14-17.715432.parquet
- split: 2023_10_28T13_05_25.339926
path:
- results_2023-10-28T13-05-25.339926.parquet
- split: latest
path:
- results_2023-10-28T13-05-25.339926.parquet
---
# Dataset Card for Evaluation run of Delcos/Mistral-Pygmalion-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Delcos/Mistral-Pygmalion-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Delcos/Mistral-Pygmalion-7b](https://huggingface.co/Delcos/Mistral-Pygmalion-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T13:05:25.339926](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b/blob/main/results_2023-10-28T13-05-25.339926.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460774,
"f1": 0.05936241610738259,
"f1_stderr": 0.0013656193493625718,
"acc": 0.41059662883495607,
"acc_stderr": 0.009533380943461503
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460774,
"f1": 0.05936241610738259,
"f1_stderr": 0.0013656193493625718
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855575
}
}
```
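For quick inspection, the aggregated metrics above can be handled as a plain Python dict. The sketch below copies a subset of the JSON shown and illustrates one assumption worth noting: the task keys follow the `harness|<task>|<num_fewshot>` naming used throughout this card, so the few-shot count can be parsed out of the key itself.

```python
# A subset of the aggregated results shown above, copied as a plain dict.
results = {
    "harness|drop|3": {"em": 0.001363255033557047, "f1": 0.05936241610738259},
    "harness|gsm8k|5": {"acc": 0.06823351023502654},
    "harness|winogrande|5": {"acc": 0.7529597474348856},
}

# Keys follow the "harness|<task>|<num_fewshot>" convention,
# so the task name and few-shot count can be recovered by splitting on "|".
for key, metrics in results.items():
    _, task, num_fewshot = key.split("|")
    print(task, int(num_fewshot), metrics)
```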
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,650 | [
[
-0.02880859375,
-0.04571533203125,
0.00933074951171875,
0.0126190185546875,
-0.0162353515625,
0.004764556884765625,
-0.027618408203125,
-0.0165557861328125,
0.0308074951171875,
0.032989501953125,
-0.04779052734375,
-0.06414794921875,
-0.054443359375,
0.01759... |
HuggingSander/todo | 2023-10-10T20:21:23.000Z | [
"region:us"
] | HuggingSander | null | null | 0 | 0 | 2023-10-10T20:21:23 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
xFrisky02/artur | 2023-10-31T13:36:18.000Z | [
"region:us"
] | xFrisky02 | null | null | 0 | 0 | 2023-10-10T20:48:24 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
nadsoft/ara-sample | 2023-10-10T21:21:30.000Z | [
"region:us"
] | nadsoft | null | null | 0 | 0 | 2023-10-10T21:12:02 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 16610581.0
num_examples: 108
download_size: 15605783
dataset_size: 16610581.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# test | 319 | [
[
-0.02606201171875,
-0.037445068359375,
0.0176849365234375,
0.0244140625,
-0.034454345703125,
0.01129150390625,
0.023834228515625,
0.0182037353515625,
0.006725311279296875,
0.03790283203125,
-0.024383544921875,
-0.01137542724609375,
-0.03857421875,
0.02136230... |
osbm/unet-explainer-data | 2023-10-27T04:18:47.000Z | [
"region:us"
] | osbm | null | null | 1 | 0 | 2023-10-10T21:37:36 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B | 2023-10-25T06:55:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T21:37:49 | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-LLaMA-1.3B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T06:54:58.430499](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B/blob/main/results_2023-10-25T06-54-58.430499.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298358,\n \"f1\": 0.045623951342281956,\n\
\ \"f1_stderr\": 0.0012088045479754918,\n \"acc\": 0.2954867628904967,\n\
\ \"acc_stderr\": 0.007847263403599461\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298358,\n\
\ \"f1\": 0.045623951342281956,\n \"f1_stderr\": 0.0012088045479754918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5864246250986582,\n \"acc_stderr\": 0.013840971763195303\n\
\ }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T06_54_58.430499
path:
- '**/details_harness|drop|3_2023-10-25T06-54-58.430499.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T06-54-58.430499.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T06_54_58.430499
path:
- '**/details_harness|gsm8k|5_2023-10-25T06-54-58.430499.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T06-54-58.430499.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T06_54_58.430499
path:
- '**/details_harness|winogrande|5_2023-10-25T06-54-58.430499.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T06-54-58.430499.parquet'
- config_name: results
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- results_2023-10-10T21-37-25.489785.parquet
- split: 2023_10_25T06_54_58.430499
path:
- results_2023-10-25T06-54-58.430499.parquet
- split: latest
path:
- results_2023-10-25T06-54-58.430499.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-1.3B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
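As the configuration listing above shows, each run's split name is derived from the run timestamp by replacing the characters a split name cannot contain. A minimal sketch of that mapping, inferred from the config listing (the helper name is hypothetical, not part of any official API):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name.

    Inferred from the configs above: '-' and ':' in the ISO timestamp
    are replaced with '_' (the '.' before microseconds is kept).
    """
    return timestamp.replace("-", "_").replace(":", "_")


# The latest run listed above maps to the split name used in the configs:
print(run_timestamp_to_split("2023-10-25T06:54:58.430499"))
# -> 2023_10_25T06_54_58.430499
```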
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T06:54:58.430499](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B/blob/main/results_2023-10-25T06-54-58.430499.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298358,
"f1": 0.045623951342281956,
"f1_stderr": 0.0012088045479754918,
"acc": 0.2954867628904967,
"acc_stderr": 0.007847263403599461
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298358,
"f1": 0.045623951342281956,
"f1_stderr": 0.0012088045479754918
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|winogrande|5": {
"acc": 0.5864246250986582,
"acc_stderr": 0.013840971763195303
}
}
```
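The results payload above is a plain nested dictionary keyed by `harness|<task>|<num_fewshot>`, so individual metrics can be read by task key. A small sketch using the headline values shown above (abbreviated; stderr fields omitted):

```python
# Aggregated "latest" results, abbreviated from the JSON shown above.
latest_results = {
    "all": {"em": 0.0014681208053691276, "f1": 0.045623951342281956, "acc": 0.2954867628904967},
    "harness|drop|3": {"em": 0.0014681208053691276, "f1": 0.045623951342281956},
    "harness|gsm8k|5": {"acc": 0.004548900682335102},
    "harness|winogrande|5": {"acc": 0.5864246250986582},
}

# Task scores are indexed by "harness|<task>|<num_fewshot>".
winogrande_acc = latest_results["harness|winogrande|5"]["acc"]
print(f"winogrande (5-shot) acc: {winogrande_acc:.4f}")
```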
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B | 2023-10-25T00:56:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T21:43:06 | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-LLaMA-2.7B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T00:56:37.125068](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B/blob/main/results_2023-10-25T00-56-37.125068.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.00031446531194131934,\n \"f1\": 0.045700503355704955,\n\
\ \"f1_stderr\": 0.0011710591235602548,\n \"acc\": 0.34035046042510264,\n\
\ \"acc_stderr\": 0.008018572932452629\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194131934,\n\
\ \"f1\": 0.045700503355704955,\n \"f1_stderr\": 0.0011710591235602548\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \
\ \"acc_stderr\": 0.002822713322387704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.67008681925809,\n \"acc_stderr\": 0.013214432542517555\n\
\ }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T00_56_37.125068
path:
- '**/details_harness|drop|3_2023-10-25T00-56-37.125068.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T00-56-37.125068.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T00_56_37.125068
path:
- '**/details_harness|gsm8k|5_2023-10-25T00-56-37.125068.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T00-56-37.125068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T00_56_37.125068
path:
- '**/details_harness|winogrande|5_2023-10-25T00-56-37.125068.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T00-56-37.125068.parquet'
- config_name: results
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- results_2023-10-10T21-42-42.589642.parquet
- split: 2023_10_25T00_56_37.125068
path:
- results_2023-10-25T00-56-37.125068.parquet
- split: latest
path:
- results_2023-10-25T00-56-37.125068.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-2.7B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
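As an illustration of the split-naming convention, a run timestamp maps onto its split name by replacing the `-` and `:` separators with underscores. This is only a sketch reproducing the pattern visible in the configs above, not part of the official card-generation tooling:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map an ISO run timestamp to the split name used in this dataset.

    Sketch only: reproduces the naming pattern seen in the YAML configs,
    e.g. "2023-10-25T00:56:37.125068" -> "2023_10_25T00_56_37.125068".
    """
    # Dashes and colons become underscores; the fractional-second dot stays.
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-10-25T00:56:37.125068"))
```

The resulting string is what you would pass as `split=` to `load_dataset` to pin a specific run instead of `"latest"`.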
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-25T00:56:37.125068](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B/blob/main/results_2023-10-25T00-56-37.125068.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its timestamped split and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194131934,
"f1": 0.045700503355704955,
"f1_stderr": 0.0011710591235602548,
"acc": 0.34035046042510264,
"acc_stderr": 0.008018572932452629
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194131934,
"f1": 0.045700503355704955,
"f1_stderr": 0.0011710591235602548
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.002822713322387704
},
"harness|winogrande|5": {
"acc": 0.67008681925809,
"acc_stderr": 0.013214432542517555
}
}
```
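For quick inspection, the headline numbers in a payload like the one above can be pulled into a flat per-task mapping. This is a minimal sketch; the dict literal simply copies the values shown in this card, and the choice of "accuracy where available, else F1" is an illustrative convention, not official leaderboard logic:

```python
# Aggregated results copied from the JSON above; "all" holds the averages.
results = {
    "all": {"em": 0.0009437919463087249, "f1": 0.045700503355704955,
            "acc": 0.34035046042510264},
    "harness|drop|3": {"em": 0.0009437919463087249, "f1": 0.045700503355704955},
    "harness|gsm8k|5": {"acc": 0.01061410159211524},
    "harness|winogrande|5": {"acc": 0.67008681925809},
}

# One headline number per task: accuracy where reported, otherwise F1.
headline = {
    task: metrics.get("acc", metrics.get("f1"))
    for task, metrics in results.items()
    if task != "all"
}
for task, score in sorted(headline.items()):
    print(f"{task}: {score:.4f}")
```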
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,718 | [
[
-0.0224151611328125,
-0.049346923828125,
0.012054443359375,
0.0277557373046875,
-0.01361083984375,
0.01413726806640625,
-0.032501220703125,
-0.0198974609375,
0.036834716796875,
0.0386962890625,
-0.0438232421875,
-0.070068359375,
-0.051849365234375,
0.0203094... |
autoevaluate/autoeval-eval-acronym_identification-default-d4ab15-94341146060 | 2023-10-10T22:16:34.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-10T22:16:30 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
totallyrealaccount/Rvrgt | 2023-10-10T22:31:51.000Z | [
"region:us"
] | totallyrealaccount | null | null | 0 | 0 | 2023-10-10T22:27:51 | Entry not found | 15 | [
[
-0.0214080810546875,
-0.01496124267578125,
0.057159423828125,
0.02880859375,
-0.0350341796875,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.05206298828125,
-0.01497650146484375,
-0.060302734375,
0.0379638... |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B | 2023-10-29T10:04:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-10T23:09:36 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T10:04:15.191273](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B/blob/main/results_2023-10-29T10-04-15.191273.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10329278523489933,\n\
\ \"em_stderr\": 0.003116735713102519,\n \"f1\": 0.1624748322147643,\n\
\ \"f1_stderr\": 0.003266242273162539,\n \"acc\": 0.442081101118795,\n\
\ \"acc_stderr\": 0.011112320094960076\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10329278523489933,\n \"em_stderr\": 0.003116735713102519,\n\
\ \"f1\": 0.1624748322147643,\n \"f1_stderr\": 0.003266242273162539\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \
\ \"acc_stderr\": 0.009818090723727293\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|arc:challenge|25_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T10_04_15.191273
path:
- '**/details_harness|drop|3_2023-10-29T10-04-15.191273.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T10-04-15.191273.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T10_04_15.191273
path:
- '**/details_harness|gsm8k|5_2023-10-29T10-04-15.191273.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T10-04-15.191273.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hellaswag|10_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T10_04_15.191273
path:
- '**/details_harness|winogrande|5_2023-10-29T10-04-15.191273.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T10-04-15.191273.parquet'
- config_name: results
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- results_2023-10-10T23-09-12.843992.parquet
- split: 2023_10_29T10_04_15.191273
path:
- results_2023-10-29T10-04-15.191273.parquet
- split: latest
path:
- results_2023-10-29T10-04-15.191273.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, with the split named using the timestamp of the run. The "train" split always points to the latest results.
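The timestamped split names (e.g. `2023_10_29T10_04_15.191273` in the configs above) encode the run time with underscores in place of colons and hyphens. A small helper (an illustrative sketch, not part of the official tooling) can parse one back into a `datetime`:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a split name like '2023_10_29T10_04_15.191273' into a datetime."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

ts = parse_split_timestamp("2023_10_29T10_04_15.191273")
print(ts.isoformat())  # 2023-10-29T10:04:15.191273
```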
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T10:04:15.191273](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B/blob/main/results_2023-10-29T10-04-15.191273.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10329278523489933,
"em_stderr": 0.003116735713102519,
"f1": 0.1624748322147643,
"f1_stderr": 0.003266242273162539,
"acc": 0.442081101118795,
"acc_stderr": 0.011112320094960076
},
"harness|drop|3": {
"em": 0.10329278523489933,
"em_stderr": 0.003116735713102519,
"f1": 0.1624748322147643,
"f1_stderr": 0.003266242273162539
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.009818090723727293
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
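The nested per-task dict above can be awkward to scan; a minimal sketch (values copied from the "Latest results" JSON, standard-error fields omitted for brevity) flattens it into `(task, metric, value)` rows:

```python
# Flatten the per-task results dict into (task, metric, value) rows
# for easier inspection. Values are copied from the card's latest results.
results = {
    "harness|drop|3": {"em": 0.10329278523489933, "f1": 0.1624748322147643},
    "harness|gsm8k|5": {"acc": 0.14935557240333586},
    "harness|winogrande|5": {"acc": 0.7348066298342542},
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]

for task, metric, value in rows:
    print(f"{task:<22} {metric:<4} {value:.4f}")
```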
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,748 | [
[
-0.031219482421875,
-0.04962158203125,
0.019287109375,
0.015716552734375,
-0.01172637939453125,
0.0028171539306640625,
-0.02557373046875,
-0.00830078125,
0.0352783203125,
0.046539306640625,
-0.052703857421875,
-0.06634521484375,
-0.048553466796875,
0.0142822... |
cadaeic/2242_samples_synthesized_recipe_squad_dataset | 2023-10-10T23:48:47.000Z | [
"region:us"
] | cadaeic | null | null | 0 | 0 | 2023-10-10T23:39:51 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
maxColten/TK | 2023-10-10T23:47:40.000Z | [
"region:us"
] | maxColten | null | null | 0 | 0 | 2023-10-10T23:47:40 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
wilayna/support | 2023-10-11T00:08:17.000Z | [
"region:us"
] | wilayna | null | null | 0 | 0 | 2023-10-10T23:53:22 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
W1lson/Book3 | 2023-10-11T04:28:53.000Z | [
"region:us"
] | W1lson | null | null | 0 | 0 | 2023-10-11T00:09:20 | ---
dataset_info:
features:
- name: Source ID
dtype: int64
- name: Primary Text
dtype: string
splits:
- name: train
num_bytes: 9831
num_examples: 87
download_size: 7549
dataset_size: 9831
---
# Dataset Card for "Book3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 381 | [
[
-0.0374755859375,
-0.003772735595703125,
0.01514434814453125,
0.0095672607421875,
-0.007293701171875,
-0.0159912109375,
0.033721923828125,
-0.01763916015625,
0.0299530029296875,
0.04473876953125,
-0.05194091796875,
-0.057708740234375,
-0.033905029296875,
-0.... |
kimsiun/clinical_trial_eligibility_crietria_recommendation | 2023-10-11T04:30:51.000Z | [
"license:mit",
"region:us"
] | kimsiun | null | null | 0 | 0 | 2023-10-11T00:36:07 | ---
license: mit
---
This is a public repository of the data used in the paper "CReSE: Enhancing Clinical Trial Design via Contrastive Learning and Rephrasing-based and Clinical Relevance-preserving Sentence Embedding" (under review).
There are three main types of data stored in the repository.
1) Positive-negative EC-title pairs: a dataset that pairs the eligibility criteria (ECs) used in a study with the study's title and other design information. It can be used to train EC recommendation models (binary classification). Different files are available depending on the input type of the trial information and the number of ECs in the trial.
- For example, the file named "train_pairs_positive_inputtype_only_title.p" contains positive pair data collected using only the trial title as the input.
- By contrast, the file "train_pairs_negative_Ent8_inputtype_title+CTinfo.p" contains negative pair data collected using the trial title plus semi-structured key design factors as the input, restricted to trials with 8 or more ECs reported through clinicaltrials.gov.
2) Original-rephrased EC pairs: the original-rephrased EC pair data used to develop the CReSE model. EC rephrasing was performed with ChatGPT (gpt-3.5-turbo).
3) Clinical relevance data between EC pairs: a dataset rating the clinical relevance between different ECs, created to evaluate the EC clustering performance of the CReSE model. It was also generated with ChatGPT (gpt-3.5-turbo).
Please refer to our paper for the specific data generation conditions and the related prompts.
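To make the file-naming convention above concrete, here is a small hypothetical helper (not part of the repository) that decodes a pair-file name into its components; the field names are illustrative assumptions:

```python
import re

def parse_pair_filename(name: str) -> dict:
    """Decode names like 'train_pairs_negative_Ent8_inputtype_title+CTinfo.p'
    into polarity, optional minimum-EC-count filter, and input type."""
    m = re.match(r"train_pairs_(positive|negative)(?:_Ent(\d+))?_inputtype_(.+)\.p$", name)
    if m is None:
        raise ValueError(f"unrecognized pair filename: {name}")
    polarity, min_ec, input_type = m.groups()
    return {
        "polarity": polarity,
        "min_ec_count": int(min_ec) if min_ec else None,
        "input_type": input_type,
    }

print(parse_pair_filename("train_pairs_negative_Ent8_inputtype_title+CTinfo.p"))
```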
| 1,557 | [
[
-0.035064697265625,
-0.042633056640625,
0.03790283203125,
0.00966644287109375,
-0.0276336669921875,
-0.02008056640625,
-0.0178070068359375,
-0.020751953125,
0.0294036865234375,
0.0672607421875,
-0.020416259765625,
-0.061248779296875,
-0.038787841796875,
0.01... |
BubbleJoe/scitail_unified_input | 2023-10-11T00:52:23.000Z | [
"region:us"
] | BubbleJoe | null | null | 0 | 0 | 2023-10-11T00:52:18 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: sentence1_binary_parse
dtype: string
- name: sentence1_parse
dtype: string
- name: sentence1
dtype: string
- name: sentence2_parse
dtype: string
- name: sentence2
dtype: string
- name: annotator_labels
sequence: string
- name: gold_label
dtype: string
- name: input
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 27422381
num_examples: 23596
- name: test
num_bytes: 2447299
num_examples: 2126
- name: validation
num_bytes: 1544360
num_examples: 1304
download_size: 9513186
dataset_size: 31414040
---
# Dataset Card for "scitail_unified_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 1,004 | [
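As a quick sanity check on the metadata above (plain Python, using the numbers from the dataset_info block), the declared dataset_size is the sum of the per-split byte counts:

```python
# Split sizes (num_bytes) from the dataset_info block above.
split_bytes = {"train": 27422381, "test": 2447299, "validation": 1544360}
total = sum(split_bytes.values())
print(total)  # 31414040, the declared dataset_size
```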
[
-0.0235137939453125,
-0.005054473876953125,
0.0243072509765625,
0.0224151611328125,
-0.0074005126953125,
0.010711669921875,
0.020599365234375,
0.0011453628540039062,
0.06390380859375,
0.0288543701171875,
-0.0628662109375,
-0.048370361328125,
-0.037872314453125,
... |
walsenjond/ansawn | 2023-10-11T01:02:38.000Z | [
"region:us"
] | walsenjond | null | null | 0 | 0 | 2023-10-11T01:02:38 | Entry not found | 15 | [
[
-0.02142333984375,
-0.014984130859375,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.04656982421875,
0.052520751953125,
0.00506591796875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060455322265625,
0.03793334... |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3 | 2023-10-11T01:17:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T01:16:57 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B-v3](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T01:16:32.937269](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3/blob/main/results_2023-10-11T01-16-32.937269.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5392786633208031,\n\
\ \"acc_stderr\": 0.03494779312823446,\n \"acc_norm\": 0.5431264898387953,\n\
\ \"acc_norm_stderr\": 0.03493217150757376,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5940288949043588,\n\
\ \"mc2_stderr\": 0.015208554054531144\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.01456431885692485,\n\
\ \"acc_norm\": 0.568259385665529,\n \"acc_norm_stderr\": 0.014474591427196202\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5893248356901015,\n\
\ \"acc_stderr\": 0.004909509538525159,\n \"acc_norm\": 0.7881896036646087,\n\
\ \"acc_norm_stderr\": 0.004077561349272391\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180276,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180276\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.02766618207553965,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.02766618207553965\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916645,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916645\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47692307692307695,\n \"acc_stderr\": 0.025323990861736118,\n\
\ \"acc_norm\": 0.47692307692307695,\n \"acc_norm_stderr\": 0.025323990861736118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.0324773433444811,\n \
\ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.0324773433444811\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7229357798165138,\n \"acc_stderr\": 0.01918848259016953,\n \"\
acc_norm\": 0.7229357798165138,\n \"acc_norm_stderr\": 0.01918848259016953\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04712821257426769,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04712821257426769\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209818,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209818\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.016246087069701407,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.016246087069701407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.02668013476167922,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.02668013476167922\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233638,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233638\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.028358956313423545,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.028358956313423545\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.027237415094592474,\n\
\ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.027237415094592474\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854927,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854927\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02017548876548405,\n \
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02017548876548405\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919796,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919796\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.01723329939957122,\n \"mc2\": 0.5940288949043588,\n\
\ \"mc2_stderr\": 0.015208554054531144\n }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|arc:challenge|25_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hellaswag|10_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-16-32.937269.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T01-16-32.937269.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T01-16-32.937269.parquet'
- config_name: results
data_files:
- split: 2023_10_11T01_16_32.937269
path:
- results_2023-10-11T01-16-32.937269.parquet
- split: latest
path:
- results_2023-10-11T01-16-32.937269.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B-v3](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3",
"harness_truthfulqa_mc_0",
split="train")
```
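The config names used in the call above follow a simple convention visible throughout this card: the harness task name with separators normalized to underscores, plus the number of few-shot examples. The helper below is a hypothetical illustration of that naming scheme, not part of the `datasets` API:

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Build the dataset config name used by this card, e.g.
    ("hendrycksTest-anatomy", 5) -> "harness_hendrycksTest_anatomy_5".
    Separators in the task name ("-" and ":") become underscores."""
    cleaned = task.replace("-", "_").replace(":", "_")
    return f"harness_{cleaned}_{num_fewshot}"

print(config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset`, and `split="latest"` can be used in place of `split="train"` to target the most recent run directly.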
## Latest results
These are the [latest results from run 2023-10-11T01:16:32.937269](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v3/blob/main/results_2023-10-11T01-16-32.937269.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5392786633208031,
"acc_stderr": 0.03494779312823446,
"acc_norm": 0.5431264898387953,
"acc_norm_stderr": 0.03493217150757376,
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5940288949043588,
"mc2_stderr": 0.015208554054531144
},
"harness|arc:challenge|25": {
"acc": 0.5401023890784983,
"acc_stderr": 0.01456431885692485,
"acc_norm": 0.568259385665529,
"acc_norm_stderr": 0.014474591427196202
},
"harness|hellaswag|10": {
"acc": 0.5893248356901015,
"acc_stderr": 0.004909509538525159,
"acc_norm": 0.7881896036646087,
"acc_norm_stderr": 0.004077561349272391
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180276,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180276
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070435,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070435
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.02766618207553965,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.02766618207553965
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916645,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916645
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47692307692307695,
"acc_stderr": 0.025323990861736118,
"acc_norm": 0.47692307692307695,
"acc_norm_stderr": 0.025323990861736118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.0324773433444811,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.0324773433444811
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7229357798165138,
"acc_stderr": 0.01918848259016953,
"acc_norm": 0.7229357798165138,
"acc_norm_stderr": 0.01918848259016953
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488418,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488418
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04712821257426769,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04712821257426769
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209818,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209818
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.016246087069701407,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.016246087069701407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.02668013476167922,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.02668013476167922
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233638,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233638
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.028358956313423545,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.028358956313423545
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630998,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.027237415094592474,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.027237415094592474
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854927,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854927
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02017548876548405,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02017548876548405
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919796,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919796
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.01723329939957122,
"mc2": 0.5940288949043588,
"mc2_stderr": 0.015208554054531144
}
}
```
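The `"all"` block at the top of the JSON is a macro-average of the per-task scores below it. Once the `results_*.json` file is downloaded, the same aggregation can be reproduced locally. A minimal sketch, using a small hand-copied excerpt of the JSON above in place of a full `json.load()`:

```python
# Excerpt of the per-task results shown above; in practice this dict would
# be obtained with json.load() on the downloaded results_*.json file.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4666666666666667},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.5263157894736842},
}

# Macro-average acc_norm over the MMLU ("hendrycksTest") subtasks:
# each subtask contributes equally, regardless of its question count.
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU acc_norm over {len(mmlu)} subtasks: {mmlu_avg:.4f}")
# prints: MMLU acc_norm over 3 subtasks: 0.4277
```

Running the same loop over all 57 `hendrycksTest` entries reproduces the kind of aggregate reported in the `"all"` block (the leaderboard additionally averages in ARC, HellaSwag, and TruthfulQA).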
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 65,099 | [
---
pretty_name: Evaluation run of llm-agents/tora-70b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-70b-v1.0](https://huggingface.co/llm-agents/tora-70b-v1.0) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-70b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T23:04:49.210564](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-70b-v1.0/blob/main/results_2023-10-28T23-04-49.210564.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3409186241610738,\n\
\ \"em_stderr\": 0.004854388549221249,\n \"f1\": 0.40523280201342454,\n\
\ \"f1_stderr\": 0.004724035643302926,\n \"acc\": 0.5286586128425962,\n\
\ \"acc_stderr\": 0.011273094879017436\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3409186241610738,\n \"em_stderr\": 0.004854388549221249,\n\
\ \"f1\": 0.40523280201342454,\n \"f1_stderr\": 0.004724035643302926\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23805913570887036,\n \
\ \"acc_stderr\": 0.011731278748420892\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613978\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-70b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|arc:challenge|25_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T23_04_49.210564
path:
- '**/details_harness|drop|3_2023-10-28T23-04-49.210564.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T23-04-49.210564.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T23_04_49.210564
path:
- '**/details_harness|gsm8k|5_2023-10-28T23-04-49.210564.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T23-04-49.210564.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hellaswag|10_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-55-12.712768.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T01-55-12.712768.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T01-55-12.712768.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T23_04_49.210564
path:
- '**/details_harness|winogrande|5_2023-10-28T23-04-49.210564.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T23-04-49.210564.parquet'
- config_name: results
data_files:
- split: 2023_10_11T01_55_12.712768
path:
- results_2023-10-11T01-55-12.712768.parquet
- split: 2023_10_28T23_04_49.210564
path:
- results_2023-10-28T23-04-49.210564.parquet
- split: latest
path:
- results_2023-10-28T23-04-49.210564.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-70b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-70b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-70b-v1.0](https://huggingface.co/llm-agents/tora-70b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-70b-v1.0",
"harness_winogrande_5",
split="train")
```
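Beyond `train` and `latest`, each run is also exposed under a split named after its timestamp, with `-` and `:` replaced by `_` (compare the run timestamp `2023-10-28T23:04:49.210564` with the split name `2023_10_28T23_04_49.210564` in the config list above). A small helper sketches that conversion; `timestamp_to_split` is a hypothetical name for illustration, not part of any API:

```python
# Split names are the run timestamp with "-" and ":" replaced by "_";
# the fractional seconds are kept as-is.
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp into the corresponding split name."""
    return ts.replace("-", "_").replace(":", "_")

# Request a specific run instead of the always-up-to-date "latest" split:
split_name = timestamp_to_split("2023-10-28T23:04:49.210564")
print(split_name)  # 2023_10_28T23_04_49.210564
```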
## Latest results
These are the [latest results from run 2023-10-28T23:04:49.210564](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-70b-v1.0/blob/main/results_2023-10-28T23-04-49.210564.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3409186241610738,
"em_stderr": 0.004854388549221249,
"f1": 0.40523280201342454,
"f1_stderr": 0.004724035643302926,
"acc": 0.5286586128425962,
"acc_stderr": 0.011273094879017436
},
"harness|drop|3": {
"em": 0.3409186241610738,
"em_stderr": 0.004854388549221249,
"f1": 0.40523280201342454,
"f1_stderr": 0.004724035643302926
},
"harness|gsm8k|5": {
"acc": 0.23805913570887036,
"acc_stderr": 0.011731278748420892
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613978
}
}
```
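For quick inspection, the per-task entries above can be flattened without downloading anything; the dict literal below simply copies the numbers from the JSON above and mirrors its `harness|<task>|<n_shots>` key convention, so this is a sketch over copied values, not an API call:

```python
# Copied from the latest-results JSON above; keys follow "harness|<task>|<n_shots>".
results = {
    "harness|drop|3": {"em": 0.3409186241610738, "f1": 0.40523280201342454},
    "harness|gsm8k|5": {"acc": 0.23805913570887036},
    "harness|winogrande|5": {"acc": 0.819258089976322},
}

# Flatten into (task, n_shots, metric, value) rows for easy tabulation.
rows = []
for key, metrics in results.items():
    _, task, n_shots = key.split("|")
    for metric, value in metrics.items():
        rows.append((task, int(n_shots), metric, round(value, 4)))

for row in sorted(rows):
    print(row)
```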
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,594 | [
[truncated embedding vector] |
AmelieSchreiber/ptm_500K | 2023-10-11T05:47:56.000Z | [
"license:mit",
"region:us"
] | AmelieSchreiber | null | null | 0 | 0 | 2023-10-11T02:07:52 | ---
license: mit
---
# Post Translational Modification 500K Dataset
This dataset was created from UniProt using
[this notebook](https://huggingface.co/datasets/AmelieSchreiber/ptm_500K/blob/main/ptm_data_preprocessing.ipynb). | 229 | [
[truncated embedding vector] |
wav2gloss/odin | 2023-10-11T03:26:46.000Z | [
"license:cc-by-4.0",
"region:us"
] | wav2gloss | null | null | 0 | 0 | 2023-10-11T02:53:25 | ---
license: cc-by-4.0
---
Adapted from ODIN (the Online Database of INterlinear glossed text) to the SIGMORPHON 2023 interlinear gloss shared task format by Nate Robinson.
## Citations
### Adapted Corpus
```bibtex
@inproceedings{he-etal-2023-sigmorefun,
title = "{S}ig{M}ore{F}un Submission to the {SIGMORPHON} Shared Task on Interlinear Glossing",
author = "He, Taiqi and
Tjuatja, Lindia and
Robinson, Nathaniel and
Watanabe, Shinji and
Mortensen, David R. and
Neubig, Graham and
Levin, Lori",
booktitle = "Proceedings of the 20th SIGMORPHON workshop on Computational Research in Phonetics, Phonology, and Morphology",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.sigmorphon-1.22",
doi = "10.18653/v1/2023.sigmorphon-1.22",
pages = "209--216",
}
```
### Original Corpus
```bibtex
@inproceedings{xia-etal-2014-enriching,
title = "Enriching {ODIN}",
author = "Xia, Fei and
Lewis, William and
Goodman, Michael Wayne and
Crowgey, Joshua and
Bender, Emily M.",
booktitle = "Proceedings of the Ninth International Conference on Language Resources and Evaluation ({LREC}'14)",
month = may,
year = "2014",
address = "Reykjavik, Iceland",
publisher = "European Language Resources Association (ELRA)",
url = "http://www.lrec-conf.org/proceedings/lrec2014/pdf/1072_Paper.pdf",
pages = "3151--3157",
}
``` | 1,562 | [
[truncated embedding vector] |
NaNg/TestData | 2023-10-16T13:24:17.000Z | [
"region:us"
] | NaNg | null | null | 0 | 0 | 2023-10-11T02:54:27 | Entry not found | 15 | [
[truncated embedding vector] |
thangnd/ddpm-butterflies-128 | 2023-10-11T06:15:03.000Z | [
"region:us"
] | thangnd | null | null | 0 | 0 | 2023-10-11T03:00:18 | Entry not found | 15 | [
[truncated embedding vector] |
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench3 | 2023-10-11T03:18:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T03:17:59 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench3](https://huggingface.co/Undi95/Mistral-11B-TestBench3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T03:17:36.482892](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench3/blob/main/results_2023-10-11T03-17-36.482892.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6305202530532628,\n\
\ \"acc_stderr\": 0.03327541391769616,\n \"acc_norm\": 0.6344593777294493,\n\
\ \"acc_norm_stderr\": 0.03325259546245853,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5365517381581612,\n\
\ \"mc2_stderr\": 0.01561816357163061\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326021,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6485759808803028,\n\
\ \"acc_stderr\": 0.004764393985111036,\n \"acc_norm\": 0.8391754630551683,\n\
\ \"acc_norm_stderr\": 0.0036661823284423424\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695815,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695815\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399673,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335307,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553704,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5365517381581612,\n\
\ \"mc2_stderr\": 0.01561816357163061\n }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-TestBench3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-17-36.482892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-17-36.482892.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-17-36.482892.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-17-36.482892.parquet'
- config_name: results
data_files:
- split: 2023_10_11T03_17_36.482892
path:
- results_2023-10-11T03-17-36.482892.parquet
- split: latest
path:
- results_2023-10-11T03-17-36.482892.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench3](https://huggingface.co/Undi95/Mistral-11B-TestBench3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
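Because split names embed the run timestamp, they sort chronologically once parsed. A minimal sketch of how the "latest" alias can be resolved by hand (the `latest_split` helper is hypothetical, not part of the `datasets` API; the split names are taken from this card's configs):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split, mimicking the 'latest' alias.

    Split names follow the pattern '2023_10_11T03_17_36.482892', so they can
    be parsed with a fixed strptime format and compared as datetimes.
    """
    fmt = "%Y_%m_%dT%H_%M_%S.%f"
    return max(split_names, key=lambda s: datetime.strptime(s, fmt))

# Example: two runs of the same configuration; the newer run is 'latest'.
splits = ["2023_10_11T03_17_36.482892", "2023_10_28T11_50_58.128612"]
print(latest_split(splits))
```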
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench3",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-11T03:17:36.482892](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench3/blob/main/results_2023-10-11T03-17-36.482892.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6305202530532628,
"acc_stderr": 0.03327541391769616,
"acc_norm": 0.6344593777294493,
"acc_norm_stderr": 0.03325259546245853,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5365517381581612,
"mc2_stderr": 0.01561816357163061
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326021,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6485759808803028,
"acc_stderr": 0.004764393985111036,
"acc_norm": 0.8391754630551683,
"acc_norm_stderr": 0.0036661823284423424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695815,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399673,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335307,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553704,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5365517381581612,
"mc2_stderr": 0.01561816357163061
}
}
```
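The reported `acc_stderr` values are consistent with the sample standard error of the mean of per-question scores: for a subject with n questions and accuracy p, stderr = sqrt(p * (1 - p) / (n - 1)). A minimal sketch reproducing two of the values above, assuming the 100-question sizes of the MMLU `medical_genetics` and `us_foreign_policy` subjects:

```python
import math

def sample_stderr(acc: float, n: int) -> float:
    """Standard error of the mean of n Bernoulli (0/1) question scores,
    using the sample (n - 1) variance."""
    return math.sqrt(acc * (1.0 - acc) / (n - 1))

# medical_genetics (100 questions): acc = 0.7 above
print(sample_stderr(0.7, 100))    # ~0.0460566, matching its acc_stderr
# us_foreign_policy (100 questions): acc = 0.85 above
print(sample_stderr(0.85, 100))   # ~0.0358870, matching its acc_stderr
```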
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
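Each per-task configuration in repositories like this one exposes a timestamped split plus a `latest` alias; the split name appears to be the run timestamp from the result filename with `-` replaced by `_`. A small sketch of that (assumed) naming convention:

```python
def split_name(file_timestamp: str) -> str:
    """Map a result-file timestamp such as '2023-10-28T11-50-58.128612'
    to the corresponding dataset split name ('2023_10_28T11_50_58.128612')."""
    return file_timestamp.replace("-", "_")

print(split_name("2023-10-28T11-50-58.128612"))  # 2023_10_28T11_50_58.128612
```

Loading with `split="latest"` instead of a timestamped name always returns the most recent run.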
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,818 | [
---
pretty_name: Evaluation run of Open-Orca/Mistral-7B-SlimOrca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Open-Orca/Mistral-7B-SlimOrca](https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T04:55:17.464867](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca/blob/main/results_2023-10-24T04-55-17.464867.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03460570469798658,\n\
\ \"em_stderr\": 0.0018718276753995743,\n \"f1\": 0.11197776845637529,\n\
\ \"f1_stderr\": 0.002382569794079873,\n \"acc\": 0.4940341305179057,\n\
\ \"acc_stderr\": 0.011521340479768794\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03460570469798658,\n \"em_stderr\": 0.0018718276753995743,\n\
\ \"f1\": 0.11197776845637529,\n \"f1_stderr\": 0.002382569794079873\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2137983320697498,\n \
\ \"acc_stderr\": 0.011293054698635044\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902543\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T04_55_17.464867
path:
- '**/details_harness|drop|3_2023-10-24T04-55-17.464867.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T04-55-17.464867.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T04_55_17.464867
path:
- '**/details_harness|gsm8k|5_2023-10-24T04-55-17.464867.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T04-55-17.464867.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-20-03.477959.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-20-03.477959.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T04_55_17.464867
path:
- '**/details_harness|winogrande|5_2023-10-24T04-55-17.464867.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T04-55-17.464867.parquet'
- config_name: results
data_files:
- split: 2023_10_11T03_20_03.477959
path:
- results_2023-10-11T03-20-03.477959.parquet
- split: 2023_10_24T04_55_17.464867
path:
- results_2023-10-24T04-55-17.464867.parquet
- split: latest
path:
- results_2023-10-24T04-55-17.464867.parquet
---
# Dataset Card for Evaluation run of Open-Orca/Mistral-7B-SlimOrca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/Mistral-7B-SlimOrca](https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
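Because the run splits are named with zero-padded timestamps, lexicographic order matches chronological order, so the run that "latest" mirrors is simply the maximum split name. A small sketch, using the two run timestamps that appear in this card's configs:

```python
# Run splits are named with zero-padded timestamps, so lexicographic
# order matches chronological order; the "latest" split always mirrors
# the newest run. These two timestamps come from this card's configs.
run_splits = [
    "2023_10_11T03_20_03.477959",
    "2023_10_24T04_55_17.464867",
]
newest = max(run_splits)  # the run that "latest" points to
print(newest)  # 2023_10_24T04_55_17.464867
```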
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T04:55:17.464867](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mistral-7B-SlimOrca/blob/main/results_2023-10-24T04-55-17.464867.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03460570469798658,
"em_stderr": 0.0018718276753995743,
"f1": 0.11197776845637529,
"f1_stderr": 0.002382569794079873,
"acc": 0.4940341305179057,
"acc_stderr": 0.011521340479768794
},
"harness|drop|3": {
"em": 0.03460570469798658,
"em_stderr": 0.0018718276753995743,
"f1": 0.11197776845637529,
"f1_stderr": 0.002382569794079873
},
"harness|gsm8k|5": {
"acc": 0.2137983320697498,
"acc_stderr": 0.011293054698635044
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902543
}
}
```
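As a quick sanity check on the numbers above, the top-level `all` block is consistent with a plain unweighted mean of the per-task accuracies. The simple-averaging assumption is mine, not something the leaderboard states:

```python
# Hypothetical sanity check: assuming the "all" block is an unweighted
# mean of the per-task accuracies, it should match the gsm8k and
# winogrande numbers reported above.
gsm8k_acc = 0.2137983320697498
winogrande_acc = 0.7742699289660616
mean_acc = (gsm8k_acc + winogrande_acc) / 2
# Matches the "all" acc of 0.4940341305179057 up to float rounding.
assert abs(mean_acc - 0.4940341305179057) < 1e-12
```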
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,662 | [
beatbox1200/victoria-justice-dataset-test | 2023-10-11T03:31:52.000Z | ["region:us"] | beatbox1200 | null | null | 0 | 0 | 2023-10-11T03:27:27 | Entry not found | 15 |
open-llm-leaderboard/details_sequelbox__StellarBright | 2023-10-11T03:36:24.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T03:35:24 |

---
pretty_name: Evaluation run of sequelbox/StellarBright
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sequelbox/StellarBright](https://huggingface.co/sequelbox/StellarBright) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sequelbox__StellarBright\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T03:35:00.957425](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__StellarBright/blob/main/results_2023-10-11T03-35-00.957425.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7109524643752221,\n\
\ \"acc_stderr\": 0.030739601585983465,\n \"acc_norm\": 0.7148315560048047,\n\
\ \"acc_norm_stderr\": 0.030707363721296215,\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6446460697306154,\n\
\ \"mc2_stderr\": 0.014753033588623255\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6885665529010239,\n \"acc_stderr\": 0.013532472099850945,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.690300736904999,\n\
\ \"acc_stderr\": 0.004614246282055375,\n \"acc_norm\": 0.8782115116510655,\n\
\ \"acc_norm_stderr\": 0.0032637298176987762\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.041633319989322605,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.041633319989322605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7433962264150943,\n \"acc_stderr\": 0.026880647889051985,\n\
\ \"acc_norm\": 0.7433962264150943,\n \"acc_norm_stderr\": 0.026880647889051985\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795717,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795717\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7106382978723405,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.7106382978723405,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130723,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130723\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.022037217340267826,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.022037217340267826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216763,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216763\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882392,\n\
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882392\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9119266055045872,\n \"acc_stderr\": 0.01215074371948166,\n \"\
acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.01215074371948166\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658925,\n\
\ \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658925\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\
\ \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n\
\ \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.031545216720054725,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.031545216720054725\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580662,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580662\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625845,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625845\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8710089399744572,\n\
\ \"acc_stderr\": 0.011986371548086867,\n \"acc_norm\": 0.8710089399744572,\n\
\ \"acc_norm_stderr\": 0.011986371548086867\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6245810055865921,\n\
\ \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.6245810055865921,\n\
\ \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.023222756797435115,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.023222756797435115\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060002,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5957446808510638,\n \"acc_stderr\": 0.02927553215970472,\n \
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.02927553215970472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5827900912646675,\n\
\ \"acc_stderr\": 0.012593959992906427,\n \"acc_norm\": 0.5827900912646675,\n\
\ \"acc_norm_stderr\": 0.012593959992906427\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.7728758169934641,\n \"acc_stderr\": 0.016949853279212373,\n \"\
acc_norm\": 0.7728758169934641,\n \"acc_norm_stderr\": 0.016949853279212373\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.0259911176728133,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.0259911176728133\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46511627906976744,\n\
\ \"mc1_stderr\": 0.017460849975873965,\n \"mc2\": 0.6446460697306154,\n\
\ \"mc2_stderr\": 0.014753033588623255\n }\n}\n```"
repo_url: https://huggingface.co/sequelbox/StellarBright
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-35-00.957425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-35-00.957425.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-35-00.957425.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-35-00.957425.parquet'
- config_name: results
data_files:
- split: 2023_10_11T03_35_00.957425
path:
- results_2023-10-11T03-35-00.957425.parquet
- split: latest
path:
- results_2023-10-11T03-35-00.957425.parquet
---
# Dataset Card for Evaluation run of sequelbox/StellarBright
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sequelbox/StellarBright
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sequelbox/StellarBright](https://huggingface.co/sequelbox/StellarBright) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
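The configuration and split names above follow a simple naming convention: a config name is the harness task identifier with its separators replaced by underscores, and each run's split name is the run timestamp with `:` and `-` replaced by `_`. A minimal sketch of that mapping (illustrative only; the leaderboard tooling may differ):

```python
def task_to_config(task: str) -> str:
    # "harness|hendrycksTest-abstract_algebra|5" -> "harness_hendrycksTest_abstract_algebra_5"
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

def timestamp_to_split(ts: str) -> str:
    # "2023-10-11T03:35:00.957425" -> "2023_10_11T03_35_00.957425"
    return ts.replace(":", "_").replace("-", "_")

print(task_to_config("harness|truthfulqa:mc|0"))         # harness_truthfulqa_mc_0
print(timestamp_to_split("2023-10-11T03:35:00.957425"))  # 2023_10_11T03_35_00.957425
```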
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sequelbox__StellarBright",
"harness_truthfulqa_mc_0",
split="train")
```
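The `acc_stderr` fields in the results that follow are consistent with the sample standard error of a binomial proportion, `sqrt(p * (1 - p) / (n - 1))`. As a sanity check (assuming n = 100 questions for the abstract_algebra task, an assumption about the MMLU split size), the reported acc of 0.39 reproduces the reported stderr:

```python
import math

def acc_stderr(p: float, n: int) -> float:
    # Sample (ddof=1) standard error of an accuracy p over n questions.
    return math.sqrt(p * (1 - p) / (n - 1))

# n = 100 is an assumption about the MMLU abstract_algebra test split.
print(acc_stderr(0.39, 100))  # ~0.049020713, matching the reported 0.04902071300001974
```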
## Latest results
These are the [latest results from run 2023-10-11T03:35:00.957425](https://huggingface.co/datasets/open-llm-leaderboard/details_sequelbox__StellarBright/blob/main/results_2023-10-11T03-35-00.957425.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in its "latest" split and in the "results" configuration):
```python
{
"all": {
"acc": 0.7109524643752221,
"acc_stderr": 0.030739601585983465,
"acc_norm": 0.7148315560048047,
"acc_norm_stderr": 0.030707363721296215,
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6446460697306154,
"mc2_stderr": 0.014753033588623255
},
"harness|arc:challenge|25": {
"acc": 0.6885665529010239,
"acc_stderr": 0.013532472099850945,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.690300736904999,
"acc_stderr": 0.004614246282055375,
"acc_norm": 0.8782115116510655,
"acc_norm_stderr": 0.0032637298176987762
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322605,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7433962264150943,
"acc_stderr": 0.026880647889051985,
"acc_norm": 0.7433962264150943,
"acc_norm_stderr": 0.026880647889051985
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795717,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795717
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7106382978723405,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.7106382978723405,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130723,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130723
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267826,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216763,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216763
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.027025433498882392,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.027025433498882392
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.01215074371948166,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.01215074371948166
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658925,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658925
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.031545216720054725,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.031545216720054725
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580662,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580662
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625845,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8710089399744572,
"acc_stderr": 0.011986371548086867,
"acc_norm": 0.8710089399744572,
"acc_norm_stderr": 0.011986371548086867
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6245810055865921,
"acc_stderr": 0.01619510424846353,
"acc_norm": 0.6245810055865921,
"acc_norm_stderr": 0.01619510424846353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435115,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435115
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060002,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.02927553215970472,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.02927553215970472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5827900912646675,
"acc_stderr": 0.012593959992906427,
"acc_norm": 0.5827900912646675,
"acc_norm_stderr": 0.012593959992906427
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7728758169934641,
"acc_stderr": 0.016949853279212373,
"acc_norm": 0.7728758169934641,
"acc_norm_stderr": 0.016949853279212373
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.0259911176728133,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.0259911176728133
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46511627906976744,
"mc1_stderr": 0.017460849975873965,
"mc2": 0.6446460697306154,
"mc2_stderr": 0.014753033588623255
}
}
```
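The per-task metrics above can be aggregated with a few lines of Python. The snippet below is a minimal sketch, using a small hypothetical excerpt of the results dict shown above, that averages `acc` across the MMLU (`hendrycksTest`) tasks:

```python
# Hypothetical excerpt of the results JSON printed above; the full dict
# contains all 57 hendrycksTest subjects plus the other harness tasks.
results = {
    "harness|hendrycksTest-jurisprudence|5": {"acc": 0.8333333333333334},
    "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
    "harness|truthfulqa:mc|0": {"mc1": 0.46511627906976744},
}

# Keep only the MMLU (hendrycksTest) entries and average their accuracy.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU average acc over {len(mmlu)} tasks: {mmlu_avg:.4f}")
```

The same filtering pattern works for any task family, since every key follows the `harness|<task>|<n_shot>` convention.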
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
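Although the split documentation is still pending, the naming scheme used in this card is regular: every task configuration exposes one split per timestamped run plus a `latest` alias, and configuration names follow the pattern `harness_<task>_<n_shot>` (e.g. `harness_winogrande_5` in the loading snippet above). A minimal sketch of that convention (the helper name is hypothetical):

```python
# Hypothetical helper mirroring the config-naming scheme used by these
# leaderboard detail datasets: "harness_<task>_<n_shot>".
def config_name(task: str, n_shot: int) -> str:
    return f"harness_{task}_{n_shot}"

print(config_name("winogrande", 5))              # harness_winogrande_5
print(config_name("hendrycksTest_marketing", 5)) # harness_hendrycksTest_marketing_5
```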
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,811 | [
---
pretty_name: Evaluation run of nicholasKluge/Aira-2-1B1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-2-1B1](https://huggingface.co/nicholasKluge/Aira-2-1B1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-2-1B1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T21:10:09.123262](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-1B1/blob/main/results_2023-10-28T21-10-09.123262.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.003932466442953021,\n \"f1_stderr\"\
: 0.00031476990050976393,\n \"acc\": 0.2513812154696133,\n \"acc_stderr\"\
: 0.007026135605808218\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 0.003932466442953021,\n \"\
f1_stderr\": 0.00031476990050976393\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616436\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-2-1B1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T21_10_09.123262
path:
- '**/details_harness|drop|3_2023-10-28T21-10-09.123262.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T21-10-09.123262.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T21_10_09.123262
path:
- '**/details_harness|gsm8k|5_2023-10-28T21-10-09.123262.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T21-10-09.123262.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-37-00.814670.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-37-00.814670.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T03-37-00.814670.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T21_10_09.123262
path:
- '**/details_harness|winogrande|5_2023-10-28T21-10-09.123262.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T21-10-09.123262.parquet'
- config_name: results
data_files:
- split: 2023_10_11T03_37_00.814670
path:
- results_2023-10-11T03-37-00.814670.parquet
- split: 2023_10_28T21_10_09.123262
path:
- results_2023-10-28T21-10-09.123262.parquet
- split: latest
path:
- results_2023-10-28T21-10-09.123262.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-2-1B1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-2-1B1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-2-1B1](https://huggingface.co/nicholasKluge/Aira-2-1B1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-2-1B1",
"harness_winogrande_5",
split="train")
```
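Split names encode the run timestamp, with the `:` and `-` characters of the original timestamp replaced by underscores so the name is filename-safe. A minimal helper (hypothetical, not part of the `datasets` API) to recover the timestamp from a split name could look like:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Parse a run split name such as '2023_10_28T21_10_09.123262'
    back into a datetime object."""
    date_part, time_part = split_name.split("T")
    date = date_part.replace("_", "-")   # 2023_10_28 -> 2023-10-28
    time = time_part.replace("_", ":")   # 21_10_09.123262 -> 21:10:09.123262
    return datetime.fromisoformat(f"{date}T{time}")

print(split_to_datetime("2023_10_28T21_10_09.123262"))
```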
## Latest results
These are the [latest results from run 2023-10-28T21:10:09.123262](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-1B1/blob/main/results_2023-10-28T21-10-09.123262.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003932466442953021,
"f1_stderr": 0.00031476990050976393,
"acc": 0.2513812154696133,
"acc_stderr": 0.007026135605808218
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003932466442953021,
"f1_stderr": 0.00031476990050976393
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616436
}
}
```
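As a sanity check, the top-level "all" block appears to be an unweighted mean of the per-task metrics: the reported accuracy above can be reproduced from the gsm8k and winogrande entries (a sketch under that assumption, not the leaderboard's actual aggregation code):

```python
# Per-task accuracies copied from the results above.
results = {
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5027624309392266},
}

# Unweighted mean over tasks reproduces the "all" accuracy.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches "acc" in the "all" block: 0.2513812154696133
```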
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
DangFutures/test | 2023-10-12T14:49:34.000Z | ["region:us"] | DangFutures | null | null | 0 | 0 | 2023-10-11T03:54:13 | Entry not found | 15 |
open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b | 2023-10-24T19:34:34.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T04:09:03 |
---
pretty_name: Evaluation run of openaccess-ai-collective/jackalope-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openaccess-ai-collective/jackalope-7b](https://huggingface.co/openaccess-ai-collective/jackalope-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T19:34:20.159933](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b/blob/main/results_2023-10-24T19-34-20.159933.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008703859060402684,\n\
\ \"em_stderr\": 0.0009512557261398897,\n \"f1\": 0.07785130033557026,\n\
\ \"f1_stderr\": 0.0016803312427089365,\n \"acc\": 0.5335823999071311,\n\
\ \"acc_stderr\": 0.012043055014472743\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.008703859060402684,\n \"em_stderr\": 0.0009512557261398897,\n\
\ \"f1\": 0.07785130033557026,\n \"f1_stderr\": 0.0016803312427089365\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28658074298711145,\n \
\ \"acc_stderr\": 0.012454841668337704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openaccess-ai-collective/jackalope-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|arc:challenge|25_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T19_34_20.159933
path:
- '**/details_harness|drop|3_2023-10-24T19-34-20.159933.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T19-34-20.159933.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T19_34_20.159933
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-34-20.159933.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T19-34-20.159933.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hellaswag|10_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T04-08-39.650186.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T04-08-39.650186.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T19_34_20.159933
path:
- '**/details_harness|winogrande|5_2023-10-24T19-34-20.159933.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T19-34-20.159933.parquet'
- config_name: results
data_files:
- split: 2023_10_11T04_08_39.650186
path:
- results_2023-10-11T04-08-39.650186.parquet
- split: 2023_10_24T19_34_20.159933
path:
- results_2023-10-24T19-34-20.159933.parquet
- split: latest
path:
- results_2023-10-24T19-34-20.159933.parquet
---
# Dataset Card for Evaluation run of openaccess-ai-collective/jackalope-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/jackalope-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/jackalope-7b](https://huggingface.co/openaccess-ai-collective/jackalope-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b",
"harness_winogrande_5",
split="train")
```
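Since each run's split is named after its timestamp (with the `-` and `:` of the ISO form replaced by underscores), you can recover and order run times without loading any data. A minimal sketch, using the split names appearing in this card:

```python
from datetime import datetime

def parse_split(name: str) -> datetime:
    # Split names look like "2023_10_24T19_34_20.159933":
    # underscores stand in for the "-" and ":" of an ISO-8601 timestamp.
    date, time = name.split("T")
    return datetime.fromisoformat(
        date.replace("_", "-") + "T" + time.replace("_", ":")
    )

splits = ["2023_10_11T04_08_39.650186", "2023_10_24T19_34_20.159933"]
latest = max(splits, key=parse_split)  # the run the "latest" split points at
print(latest)
```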
## Latest results
These are the [latest results from run 2023-10-24T19:34:20.159933](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__jackalope-7b/blob/main/results_2023-10-24T19-34-20.159933.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.008703859060402684,
"em_stderr": 0.0009512557261398897,
"f1": 0.07785130033557026,
"f1_stderr": 0.0016803312427089365,
"acc": 0.5335823999071311,
"acc_stderr": 0.012043055014472743
},
"harness|drop|3": {
"em": 0.008703859060402684,
"em_stderr": 0.0009512557261398897,
"f1": 0.07785130033557026,
"f1_stderr": 0.0016803312427089365
},
"harness|gsm8k|5": {
"acc": 0.28658074298711145,
"acc_stderr": 0.012454841668337704
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
}
}
```
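For the two `acc`-reporting tasks above, the `"all"` block is simply the unweighted mean of the per-task scores. A quick local check, with the values copied from the JSON (the leaderboard computes this aggregate for you; this sketch only verifies it):

```python
# Per-task accuracies copied from the latest-results JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.28658074298711145},
    "harness|winogrande|5": {"acc": 0.7805840568271507},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)  # matches "all" -> "acc" up to float rounding
print(mean_acc)
```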
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,766 | [
[
-0.037078857421875,
-0.046783447265625,
0.003131866455078125,
0.01025390625,
-0.00942230224609375,
0.00870513916015625,
-0.031829833984375,
-0.020294189453125,
0.03619384765625,
0.042266845703125,
-0.047515869140625,
-0.06878662109375,
-0.048126220703125,
0.... |
W1lson/Book4 | 2023-10-11T04:44:17.000Z | [
"region:us"
] | W1lson | null | null | 0 | 0 | 2023-10-11T04:34:39 | ---
dataset_info:
features:
- name: Source ID
dtype: int64
- name: Primary Text
dtype: string
splits:
- name: train
num_bytes: 9831
num_examples: 87
download_size: 0
dataset_size: 9831
---
# Dataset Card for "Book4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 378 | [
[
-0.041259765625,
0.003787994384765625,
0.0091094970703125,
0.003993988037109375,
-0.011383056640625,
-0.009490966796875,
0.0323486328125,
-0.0150909423828125,
0.035614013671875,
0.041656494140625,
-0.056915283203125,
-0.05926513671875,
-0.028961181640625,
-0... |
Ka4on/ultrasound_sample | 2023-10-11T04:43:17.000Z | [
"region:us"
] | Ka4on | null | null | 0 | 0 | 2023-10-11T04:42:54 | Entry not found | 15 | [
[
-0.021392822265625,
-0.0149688720703125,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.046539306640625,
0.052520751953125,
0.005046844482421875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.01495361328125,
-0.060333251953125,
0.03... |
HUNTERDEBASTADOR/models2 | 2023-10-14T14:58:14.000Z | [
"region:us"
] | HUNTERDEBASTADOR | null | null | 0 | 0 | 2023-10-11T05:05:13 | Entry not found | 15 | [
[
-0.021392822265625,
-0.0149688720703125,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.046539306640625,
0.052520751953125,
0.005046844482421875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.01495361328125,
-0.060333251953125,
0.03... |
AlignmentLab-AI/Caption-Creation1.0 | 2023-10-11T07:31:42.000Z | [
"region:us"
] | AlignmentLab-AI | null | null | 0 | 0 | 2023-10-11T05:14:19 | Entry not found | 15 | [
[
-0.021392822265625,
-0.0149688720703125,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.046539306640625,
0.052520751953125,
0.005046844482421875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.01495361328125,
-0.060333251953125,
0.03... |
autoevaluate/autoeval-eval-amazon_reviews_multi-ja-0b8fc2-94377146065 | 2023-10-11T05:19:17.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-11T05:19:13 | Entry not found | 15 | [
[
-0.021392822265625,
-0.0149688720703125,
0.057220458984375,
0.0288238525390625,
-0.03509521484375,
0.046539306640625,
0.052520751953125,
0.005046844482421875,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.01495361328125,
-0.060333251953125,
0.03... |
MananSantoki/llama-test | 2023-10-11T05:37:34.000Z | [
"region:us"
] | MananSantoki | null | null | 0 | 0 | 2023-10-11T05:37:34 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
progenifixofficial/progenifix | 2023-10-11T06:11:21.000Z | [
"region:us"
] | progenifixofficial | null | null | 0 | 0 | 2023-10-11T06:11:04 | <h2 style="text-align: left;"><a href="https://www.globalfitnessmart.com/get-progenifix"><span style="background-color: #ffcc00; color: blue;"><strong>{</strong><strong>Progenifix - Official Website -- Order Now}</strong></span></a></h2>
<h2><strong>➡️<span style="color: #ff9900;">● For Order Official Website - <a href="https://www.globalfitnessmart.com/get-progenifix">https://www.globalfitnessmart.com/get-progenifix</a></span></strong><br /><strong>➡️<span style="color: #33cccc;">● Item Name: — <a href="https://www.globalfitnessmart.com/get-progenifix">Progenifix</a></span></strong><br /><strong>➡️<span style="color: #99cc00;">● Ingredients: — All Natural</span></strong><br /><strong>➡️<span style="color: #339966;">● Incidental Effects: — NA</span></strong><br /><strong>➡️<span style="color: purple;"><span style="color: red;">● Accessibility: — <a href="https://www.globalfitnessmart.com/get-progenifix">Online</a></span><br /></span></strong></h2>
<h2><a href="https://www.globalfitnessmart.com/get-progenifix"><strong><span style="background-color: #ffcc00; color: blue;">➡️Hurry Up - Limited Time Offer - Purchase Now</span></strong></a><br /><a href="https://www.globalfitnessmart.com/get-progenifix"><strong><span style="background-color: #ffcc00; color: blue;">➡️Hurry Up - Limited Time Offer - Purchase Now</span></strong></a><br /><a href="https://www.globalfitnessmart.com/get-progenifix"><strong><span style="background-color: #ffcc00; color: blue;">➡️Hurry Up - Limited Time Offer - Purchase Now</span></strong></a></h2>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-progenifix"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIb1ZBKUI-c3pbno8_SGlZzeB2XmfIgAcDtuAGXaq2SzcI2uUX2EhU5QZmLGCDEIM22ljE0oxK7v9tv9L_Ash8gQg3qC9HNKQvQbLmmNjjDvTiI8DES1v1jiRh8BfxNAhU-tRNR8ZrEJEhqsSgYNVfTLcHoE9wrHigWjR9zfdO4SSsRpKJfGg4nhIfFw8p/w640-h480/Progenifix%2010.png" alt="" width="640" height="480" border="0" data-original-height="417" data-original-width="556" /></a></div>
<h2><strong>Progenifix Weight Loss Supplement: Reviews and Extensive guide 2023 Read Before</strong></h2>
<p>Progenifix is a supplement that helps consumers to improve weight loss and support good health. The formula is easy to take every day, though some consumers will notice that they burn through weight more rapidly than they ever have before.</p>
<h2><strong>What is Progenifix?</strong></h2>
<p>If you are looking to lose weight, you may have heard of Progenifix. This weight loss supplement is designed to help you burn fat and lose weight quickly. But what is Progenifix and how does it work? In this article, we will take a closer look at the Progenifix weight loss supplement and provide an in-depth review.</p>
<p>Progenifix is a weight loss supplement that contains a powerful blend of ingredients that are designed to help you burn fat and lose weight quickly. Other ingredients in Progenifix include green coffee bean extract, green tea extract, and African mango seed extract. These ingredients are all known to be effective for weight loss and they work together to help you burn fat quickly.</p>
<h2 style="text-align: center;"><span style="background-color: #ffcc00; color: #0000ff;"><a style="background-color: #ffcc00; color: #0000ff;" href="https://www.globalfitnessmart.com/get-progenifix"><strong>SPECIAL PROMO: Get Progenifix at the Lowest Discounted Price Online</strong></a></span></h2>
<h2><strong>How Does Progenifix Work?</strong></h2>
<p>Progenifix is a supplement that utilizes natural, scientifically supported ingredients to enhance weight loss results and improve overall well-being. According to the product's official website, it has three primary benefits, which are:</p>
<ul>
<li>Supporting weight loss</li>
<li>Promoting wellness and vitality</li>
<li>Supporting immune system</li>
</ul>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-progenifix"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnniR87uocacvkq4TTxlfwddQU1kvf3128afknQr-lWE2a82lZWewC7W5Ik-UwQUYMZox5t7T3SqHzcT85N3rlokkVNZooQaGQMKhstlhrv3MaTvGG9TH1BxttNJc-m8p9JF840IhyphenhyphenpcTHnUEnF1mbYT0CXCv7wX23Spp3b9_FadKCc2kO8tOl5VpUjQKj/w640-h360/Progenifix-Ingredients.webp" alt="" width="640" height="360" border="0" data-original-height="1260" data-original-width="2240" /></a></div>
<h2><strong>What is in Progenifix Natural Ingredients</strong></h2>
<p>Exclusively offering natural ingredients, the Progenifix formula includes the following mushrooms:</p>
<p><strong>#Royal Sun Agaricus</strong></p>
<p>Royal Sun Mushrooms are incredibly supportive of the immune system and its response to potential health threats. With a high abundance of immunomodulating polysaccharides, consumers inherently reduce their risk of infection, allergic reactions, and even asthma, according to early research with mice. It can also reduce inflammation, especially for consumers with inflammatory bowel disease.</p>
<p><strong>#Cordyceps Sinensis</strong></p>
<p>Cordyceps Sinensis is a common mushroom, and traditional healers use it in Sikkim. These experts believe this mushroom can help with all kinds of ailments when treated like a tonic. With the proper preparation, the creators sometimes use it to boost energy, regulate the appetite, and promote better endurance and stamina.</p>
<p><strong>#Chaga</strong></p>
<p>Based on current evidence, Chaga mushrooms have antioxidants that can soothe arthritis and reduce high blood pressure. This mushroom eases inflammation throughout the body, including the stomach lining and the joints. By dealing with inflammation, consumers can adequately digest their food without pain.</p>
<p><strong>#Lion's Mane</strong></p>
<p>Lion's Mane Mushrooms are quite a sight to see, and they are just as beneficial as they are interesting to look at. Current research links the use of lion's mane mushrooms to preventing problems like dementia, heart disease, and cancer. When used by animals, this mushroom reduces the risk of diabetes as well.<br />Many of the benefits consumers reap with Lion's Mane Mushroom are in the brain.</p>
<p><strong>#White Button</strong></p>
<p>The final mushroom of this blend is usually found in the produce section of even the smallest grocery stores – white button mushrooms. They embody what consumers often think of when they hear "mushrooms," but they can also greatly benefit the user's health.</p>
<h2 style="text-align: center;"><span style="background-color: #ffcc00; color: #0000ff;"><a style="background-color: #ffcc00; color: #0000ff;" href="https://www.globalfitnessmart.com/get-progenifix"><strong>SPECIAL PROMO[Limited Discount]: "Progenifix USA"Official Website!</strong></a></span></h2>
<h2><strong>The Benefits of Progenifix Supplement</strong></h2>
<p>● Progenifix supplement helps eliminate excess fat stored in the body</p>
<p>● The ingredients in the Progenifix formula are rich in fiber which helps suppress appetite and food cravings</p>
<p>● Progenifix is rich in anti-aging compounds which support metabolic functions</p>
<p>● Progenifix supplement helps improve energy levels, mood, and mental clarity</p>
<p>● The formula assists in strengthening the immune system by reducing inflammation</p>
<p>● It has antioxidants that prevent damage from free radicals and oxidative stress</p>
<p>● Progenifix formula can treat digestive and circulatory disorders and reduce the symptoms of diabetes</p>
<p>● The supplement provides the necessary nutrients the body needs to stay healthy</p>
<p>● The supplement targets the root cause of slow metabolism and restores regular metabolic activity</p>
<p>● The formula provides better sleep and boosts confidence and self-esteem</p>
<h2><strong>How to Use Progenifix</strong></h2>
<p>Assuming you are referring to the weight loss supplement known as Progenifix: When using any weight loss supplement, it is important to follow the instructions on the product label. It is also important to consult with a healthcare professional before starting any new supplement, especially if you have any medical conditions or are taking any medications.</p>
<p>Progenifix is a dietary supplement that comes in capsule form. The recommended dose is two capsules per day, taken with water. It is best to take the capsules with meals. For best results, it is recommended to use Progenifix for at least 8 weeks.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-progenifix"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5Vy_IcQ9rpiawvyigHopsJt-HE9IHu3YESbkI8uX_eQfQF4ZjVFm1ABW4ZioTHDgbstKigkedI7ZguHXPwRfiD9qM_X6HCFE6iWcCzLtrOqwUjLd8X_O1YcA8Qz5kZI9up7IZxjMer3T57voKXMPy7q_uSfI2XD4i6CYe__3zZgTqnzR70CyY52q-7vAz/w266-h400/Progenifix%20weight%20loss.jpg" alt="" width="266" height="400" border="0" data-original-height="1104" data-original-width="736" /></a></div>
<h2><strong>Frequently Asked Questions About Progenifix</strong></h2>
<p><strong>Q - How often should the Progenifix formula be taken?</strong></p>
<p>A - Users must take two capsules daily to get the desired results. The best time of day to use it is in the morning.</p>
<p><strong>Q - What does Progenifix taste like?</strong></p>
<p>A - Nothing! Even with the plethora of mushrooms, consumers won't have to worry about taste because it is condensed within the capsules.</p>
<p><strong>Q - How long will users have to keep using Progenifix?</strong></p>
<p>A - Since every person starts at a different place in their weight loss, they also have different paces that they go at in their progress. While the total amount of time the user needs to stick with Progenifix will change, most of the initial progress will be noticeable in the first week. Sticking with this regimen for any time is beneficial, but users who commit to using it for longer will see the most intense changes.</p>
<p><strong>Q - Is it possible to purchase Progenifix from a different store?</strong></p>
<p>A - No. The creators want to ensure that users can get the best price possible, which is why the only place that consumers can purchase Progenifix is on the official website.</p>
<p><strong>Q - What should the user do if they lose weight too quickly?</strong></p>
<p>A - This formula is a highly effective remedy, which is why some consumers might grow concerned about how quickly they shed pounds. If this rate of weight loss is overwhelming or even alarming, they can cut the dose to no more than one capsule a day.</p>
<p><strong>Q - What is the best number of bottles to order?</strong></p>
<p>A - Each bottle contains enough Progenifix formula to last through an entire month, meaning users should purchase the same number of bottles as the number of months they want to use it. Taking the formula for 3-6 months reaps the best rewards, giving users the best price on their order.</p>
<p><strong>Q - What if the formula doesn't work for the user?</strong></p>
<p>A - If the user finds that the Progenifix formula doesn't help with their weight loss, they can get a full refund with a money-back guarantee within 60 days. A refund can be established with the customer service team before sending back any products.</p>
<h2 style="text-align: center;"><span style="background-color: #ffcc00; color: #0000ff;"><a style="background-color: #ffcc00; color: #0000ff;" href="https://www.globalfitnessmart.com/get-progenifix"><strong>(EXCLUSIVE OFFER)Click Here : "Progenifix USA"Official Website!</strong></a></span></h2>
<h2><strong>Buying a Bottle of Progenifix</strong></h2>
<p>The only way consumers can order Progenifix is by visiting the official website . While three options are available for the packages, only consumers who demand the most significant quantity (6 bottles) will get free shipping on their purchase.</p>
<p>The available packages include the following:</p>
<p><strong>● One bottle for $69</strong></p>
<p><strong>● Three bottles for $177 ($59 per bottle)</strong></p>
<p><strong>● Six bottles for $294 ($49 per bottle)</strong></p>
<h2><strong>Conclusion</strong></h2>
<p>In conclusion, Progenifix is a great weight loss supplement that can help you achieve your weight goals with its natural ingredients and superb formulation. It has received favorable reviews from customers who have tried it and reported positive results in terms of energy levels and fat burning capabilities. We highly recommend giving this product a try if you are looking for an effective way to lose weight without any side effects. weight loss supplements.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-progenifix"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkmDXhB5tFDMCVQJ3hv3xh5qTSt7Zo748CAfFLOhbx_TraK6GZdQ1-Qiy-KYQCsezLo5-M1eppklbei6Yp4c-esFrYezEoU_xUIip55gCUkBbLjaZiGgTJBwA2Rg4V5MLllp67NOz_62p-9F1MsEKMIaS-Mynx5QQeoythx6P0GZeNv06E1J2q-PK78kwM/w640-h422/Progenifix%20price.jpg" alt="" width="640" height="422" border="0" data-original-height="413" data-original-width="625" /></a></div>
<h2 style="text-align: center;"><span style="background-color: #ffcc00; color: #0000ff;"><a style="background-color: #ffcc00; color: #0000ff;" href="https://www.globalfitnessmart.com/get-progenifix"><strong>(EXCLUSIVE OFFER)Click Here : "Progenifix USA"Official Website!</strong></a></span></h2>
<h2><strong># Read More</strong></h2>
<p><strong><a href="https://progenifix-official.clubeo.com">https://progenifix-official.clubeo.com</a></strong></p>
<p><strong><a href="https://progenifix-official.clubeo.com/page/progenifix-is-legit-2023-updated-report.html">https://progenifix-official.clubeo.com/page/progenifix-is-legit-2023-updated-report.html</a></strong></p>
<p><strong><a href="https://progenifix-official.clubeo.com/page/progenifix-review-2023-does-it-really-work-for-weight-loss.html">https://progenifix-official.clubeo.com/page/progenifix-review-2023-does-it-really-work-for-weight-loss.html</a></strong></p>
<p><strong><a href="https://progenifix-official.clubeo.com/calendar/2023/10/19/progenifix-exposed-reviews-2023-legit-scam-alert">https://progenifix-official.clubeo.com/calendar/2023/10/19/progenifix-exposed-reviews-2023-legit-scam-alert</a></strong></p>
<p><strong><a href="https://groups.google.com/g/progenifix-official-us/c/xeAEDlM-BhM">https://groups.google.com/g/progenifix-official-us/c/xeAEDlM-BhM</a></strong></p>
<p> </p> | 14,740 | [
[
-0.03411865234375,
-0.0281982421875,
0.0128631591796875,
0.0153045654296875,
-0.0076446533203125,
0.01503753662109375,
0.021392822265625,
-0.046844482421875,
0.06536865234375,
0.023529052734375,
-0.07794189453125,
-0.023834228515625,
-0.028564453125,
0.02757... |
autoevaluate/autoeval-eval-acronym_identification-default-f41302-94382146071 | 2023-10-11T06:17:47.000Z | [
"region:us"
] | autoevaluate | null | null | 0 | 0 | 2023-10-11T06:17:44 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
open-llm-leaderboard/details_uukuguy__speechless-code-mistral-orca-7b-v1.0 | 2023-10-24T15:07:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T06:18:03 | ---
pretty_name: Evaluation run of uukuguy/speechless-code-mistral-orca-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-code-mistral-orca-7b-v1.0](https://huggingface.co/uukuguy/speechless-code-mistral-orca-7b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-code-mistral-orca-7b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T15:07:12.352820](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-orca-7b-v1.0/blob/main/results_2023-10-24T15-07-12.352820.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4526006711409396,\n\
\ \"em_stderr\": 0.005097407791242309,\n \"f1\": 0.4989010067114103,\n\
\ \"f1_stderr\": 0.004905672332696013,\n \"acc\": 0.42884877867222604,\n\
\ \"acc_stderr\": 0.009659566392137438\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4526006711409396,\n \"em_stderr\": 0.005097407791242309,\n\
\ \"f1\": 0.4989010067114103,\n \"f1_stderr\": 0.004905672332696013\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08263836239575435,\n \
\ \"acc_stderr\": 0.0075840892201481476\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.01173504356412673\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-code-mistral-orca-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|arc:challenge|25_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T15_07_12.352820
path:
- '**/details_harness|drop|3_2023-10-24T15-07-12.352820.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T15-07-12.352820.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T15_07_12.352820
path:
- '**/details_harness|gsm8k|5_2023-10-24T15-07-12.352820.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T15-07-12.352820.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hellaswag|10_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T06-17-39.611971.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T06-17-39.611971.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T06-17-39.611971.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T15_07_12.352820
path:
- '**/details_harness|winogrande|5_2023-10-24T15-07-12.352820.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T15-07-12.352820.parquet'
- config_name: results
data_files:
- split: 2023_10_11T06_17_39.611971
path:
- results_2023-10-11T06-17-39.611971.parquet
- split: 2023_10_24T15_07_12.352820
path:
- results_2023-10-24T15-07-12.352820.parquet
- split: latest
path:
- results_2023-10-24T15-07-12.352820.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-code-mistral-orca-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-code-mistral-orca-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-code-mistral-orca-7b-v1.0](https://huggingface.co/uukuguy/speechless-code-mistral-orca-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-code-mistral-orca-7b-v1.0",
"harness_winogrande_5",
split="train")
```
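Besides `latest`, each timestamped run is exposed as its own split, named after the run timestamp with `-` and `:` replaced by `_` (compare the split names and parquet filenames in the YAML metadata above). A small helper (purely illustrative, not part of any library) can derive the split name:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp such as '2023-10-24T15:07:12.352820'
    into the split name used by this dataset's configurations."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-10-24T15:07:12.352820"))
# 2023_10_24T15_07_12.352820
```

The result can be passed as `split=` to `load_dataset` to pin a specific run instead of `latest`.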
## Latest results
These are the [latest results from run 2023-10-24T15:07:12.352820](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-code-mistral-orca-7b-v1.0/blob/main/results_2023-10-24T15-07-12.352820.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4526006711409396,
"em_stderr": 0.005097407791242309,
"f1": 0.4989010067114103,
"f1_stderr": 0.004905672332696013,
"acc": 0.42884877867222604,
"acc_stderr": 0.009659566392137438
},
"harness|drop|3": {
"em": 0.4526006711409396,
"em_stderr": 0.005097407791242309,
"f1": 0.4989010067114103,
"f1_stderr": 0.004905672332696013
},
"harness|gsm8k|5": {
"acc": 0.08263836239575435,
"acc_stderr": 0.0075840892201481476
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.01173504356412673
}
}
```
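The aggregated block above is plain JSON, so once retrieved it can be inspected programmatically. As a minimal sketch (hard-coding the values shown above rather than downloading them), here is how the per-task accuracies can be extracted:

```python
import json

# Aggregated results as reported in the latest run above.
results_json = """
{
    "all": {"em": 0.4526006711409396, "f1": 0.4989010067114103, "acc": 0.42884877867222604},
    "harness|drop|3": {"em": 0.4526006711409396, "f1": 0.4989010067114103},
    "harness|gsm8k|5": {"acc": 0.08263836239575435},
    "harness|winogrande|5": {"acc": 0.7750591949486977}
}
"""
results = json.loads(results_json)

# Collect every per-task accuracy, skipping the "all" aggregate.
accuracies = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(accuracies)
# {'harness|gsm8k|5': 0.08263836239575435, 'harness|winogrande|5': 0.7750591949486977}
```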
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,834 | [
[
-0.024444580078125,
-0.04656982421875,
0.01029205322265625,
0.015899658203125,
-0.01477813720703125,
0.0010862350463867188,
-0.0212860107421875,
-0.018585205078125,
0.0280303955078125,
0.044708251953125,
-0.04534912109375,
-0.0699462890625,
-0.0426025390625,
... |
artdwn/work | 2023-10-11T07:45:11.000Z | [
"region:us"
] | artdwn | null | null | 0 | 0 | 2023-10-11T06:43:49 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
MONQ/images | 2023-10-11T06:48:28.000Z | [
"region:us"
] | MONQ | null | null | 0 | 0 | 2023-10-11T06:47:57 | Entry not found | 15 | [
[
-0.021392822265625,
-0.01494598388671875,
0.05718994140625,
0.028839111328125,
-0.0350341796875,
0.046539306640625,
0.052490234375,
0.00507354736328125,
0.051361083984375,
0.01702880859375,
-0.052093505859375,
-0.01494598388671875,
-0.06036376953125,
0.03790... |
aeris99/Capstone | 2023-10-11T06:51:25.000Z | [
"region:us"
] | aeris99 | null | null | 0 | 0 | 2023-10-11T06:51:25 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
tinhpx2911/truyenfull_processed | 2023-10-11T07:33:07.000Z | [
"region:us"
] | tinhpx2911 | null | null | 0 | 0 | 2023-10-11T06:52:59 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: link
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 9414309891
num_examples: 665475
download_size: 2167644522
dataset_size: 9414309891
---
# Dataset Card for "truyenfull_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 526 | [
[
-0.0244598388671875,
-0.0305328369140625,
0.01163482666015625,
0.02056884765625,
-0.0269622802734375,
-0.0001366138458251953,
0.0162811279296875,
-0.028411865234375,
0.0616455078125,
0.04736328125,
-0.0687255859375,
-0.0609130859375,
-0.045989990234375,
-0.0... |
ilyas3141/test_ilias24 | 2023-10-11T06:56:40.000Z | [
"region:us"
] | ilyas3141 | null | null | 0 | 0 | 2023-10-11T06:56:21 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
ilyas3141/test_ilias_test | 2023-10-11T07:00:54.000Z | [
"region:us"
] | ilyas3141 | null | null | 0 | 0 | 2023-10-11T06:57:49 | Entry not found | 15 | [
[
-0.02142333984375,
-0.01495361328125,
0.05718994140625,
0.0288238525390625,
-0.035064697265625,
0.046539306640625,
0.052520751953125,
0.005062103271484375,
0.0513916015625,
0.016998291015625,
-0.052093505859375,
-0.014984130859375,
-0.060394287109375,
0.0379... |
open-llm-leaderboard/details_ehartford__dolphin-2.1-mistral-7b | 2023-10-28T06:17:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T07:08:34 | ---
pretty_name: Evaluation run of ehartford/dolphin-2.1-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/dolphin-2.1-mistral-7b](https://huggingface.co/ehartford/dolphin-2.1-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.1-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T06:17:12.096857](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.1-mistral-7b/blob/main/results_2023-10-28T06-17-12.096857.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0025167785234899327,\n\
\ \"em_stderr\": 0.0005131152834514602,\n \"f1\": 0.07557885906040251,\n\
\ \"f1_stderr\": 0.0015806922251337756,\n \"acc\": 0.49258006202828786,\n\
\ \"acc_stderr\": 0.011432753263209281\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514602,\n\
\ \"f1\": 0.07557885906040251,\n \"f1_stderr\": 0.0015806922251337756\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20773313115996966,\n \
\ \"acc_stderr\": 0.011174572716705898\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712662\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/dolphin-2.1-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|arc:challenge|25_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|arc:challenge|25_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T09_35_25.636267
path:
- '**/details_harness|drop|3_2023-10-26T09-35-25.636267.parquet'
- split: 2023_10_28T06_17_12.096857
path:
- '**/details_harness|drop|3_2023-10-28T06-17-12.096857.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T06-17-12.096857.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T09_35_25.636267
path:
- '**/details_harness|gsm8k|5_2023-10-26T09-35-25.636267.parquet'
- split: 2023_10_28T06_17_12.096857
path:
- '**/details_harness|gsm8k|5_2023-10-28T06-17-12.096857.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T06-17-12.096857.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hellaswag|10_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hellaswag|10_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-08-11.393844.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-16-54.692993.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T07-08-11.393844.parquet'
- split: 2023_10_11T07_16_54.692993
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T07-16-54.692993.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T07-16-54.692993.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T09_35_25.636267
path:
- '**/details_harness|winogrande|5_2023-10-26T09-35-25.636267.parquet'
- split: 2023_10_28T06_17_12.096857
path:
- '**/details_harness|winogrande|5_2023-10-28T06-17-12.096857.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T06-17-12.096857.parquet'
- config_name: results
data_files:
- split: 2023_10_11T07_08_11.393844
path:
- results_2023-10-11T07-08-11.393844.parquet
- split: 2023_10_11T07_16_54.692993
path:
- results_2023-10-11T07-16-54.692993.parquet
- split: 2023_10_26T09_35_25.636267
path:
- results_2023-10-26T09-35-25.636267.parquet
- split: 2023_10_28T06_17_12.096857
path:
- results_2023-10-28T06-17-12.096857.parquet
- split: latest
path:
- results_2023-10-28T06-17-12.096857.parquet
---
# Dataset Card for Evaluation run of ehartford/dolphin-2.1-mistral-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/dolphin-2.1-mistral-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.1-mistral-7b](https://huggingface.co/ehartford/dolphin-2.1-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.1-mistral-7b",
"harness_winogrande_5",
split="train")
```
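Each run's split name (e.g. `2023_10_28T06_17_12.096857` in the configurations above) is simply the run timestamp with `-` and `:` replaced by `_`. A small illustrative helper (not part of the leaderboard tooling) to derive a split name from a run timestamp:

```python
def split_name_from_timestamp(run_timestamp: str) -> str:
    """Convert a run timestamp such as '2023-10-28T06:17:12.096857'
    into the split name used in this dataset's configurations."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-10-28T06:17:12.096857"))
# → 2023_10_28T06_17_12.096857
```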
## Latest results
These are the [latest results from run 2023-10-28T06:17:12.096857](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.1-mistral-7b/blob/main/results_2023-10-28T06-17-12.096857.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514602,
"f1": 0.07557885906040251,
"f1_stderr": 0.0015806922251337756,
"acc": 0.49258006202828786,
"acc_stderr": 0.011432753263209281
},
"harness|drop|3": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514602,
"f1": 0.07557885906040251,
"f1_stderr": 0.0015806922251337756
},
"harness|gsm8k|5": {
"acc": 0.20773313115996966,
"acc_stderr": 0.011174572716705898
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712662
}
}
```
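The top-level `"all"` entry appears to be the unweighted mean of the per-task metrics (the `em`/`f1` values come from the DROP task alone). For example, the accuracy reported under `"all"` can be recovered from the GSM8K and Winogrande accuracies:

```python
# Per-task accuracies copied from the JSON above.
gsm8k_acc = 0.20773313115996966
winogrande_acc = 0.7774269928966061

# "all" -> "acc" is the unweighted mean of the two accuracy-based tasks.
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ≈ 0.49258006202828786, matching the "all" block
```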
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
open-llm-leaderboard/details_AA051610__VA | 2023-10-11T07:23:50.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T07:22:48
---
pretty_name: Evaluation run of AA051610/VA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/VA](https://huggingface.co/AA051610/VA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__VA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-11T07:22:26.417131](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__VA/blob/main/results_2023-10-11T07-22-26.417131.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4972405415581996,\n\
\ \"acc_stderr\": 0.03512578000813228,\n \"acc_norm\": 0.5002960487991649,\n\
\ \"acc_norm_stderr\": 0.03512615731416433,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44928868954080875,\n\
\ \"mc2_stderr\": 0.014916546411376396\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3848122866894198,\n \"acc_stderr\": 0.014218371065251105,\n\
\ \"acc_norm\": 0.4138225255972696,\n \"acc_norm_stderr\": 0.014392730009221007\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47390957976498704,\n\
\ \"acc_stderr\": 0.004982983592459198,\n \"acc_norm\": 0.6251742680740888,\n\
\ \"acc_norm_stderr\": 0.004830885704380092\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270658,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270658\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.048523658709390974,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709390974\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.567741935483871,\n\
\ \"acc_stderr\": 0.028181739720019416,\n \"acc_norm\": 0.567741935483871,\n\
\ \"acc_norm_stderr\": 0.028181739720019416\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.033764582465095665,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.033764582465095665\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846475,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846475\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6128440366972477,\n \"acc_stderr\": 0.02088423199264345,\n \"\
acc_norm\": 0.6128440366972477,\n \"acc_norm_stderr\": 0.02088423199264345\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319355,\n \"\
acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319355\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.029343114798094462,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.029343114798094462\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n\
\ \"acc_stderr\": 0.0167063814150579,\n \"acc_norm\": 0.6781609195402298,\n\
\ \"acc_norm_stderr\": 0.0167063814150579\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.02661335084026174,\n\
\ \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.02661335084026174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468628,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468628\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4934640522875817,\n \"acc_stderr\": 0.028627470550556054,\n\
\ \"acc_norm\": 0.4934640522875817,\n \"acc_norm_stderr\": 0.028627470550556054\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5594855305466238,\n\
\ \"acc_stderr\": 0.028196400574197426,\n \"acc_norm\": 0.5594855305466238,\n\
\ \"acc_norm_stderr\": 0.028196400574197426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.02774431344337654,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.02774431344337654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284073,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284073\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.012683972513598806,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.012683972513598806\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.02018014484330729,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.02018014484330729\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6081871345029239,\n \"acc_stderr\": 0.03743979825926401,\n\
\ \"acc_norm\": 0.6081871345029239,\n \"acc_norm_stderr\": 0.03743979825926401\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44928868954080875,\n\
\ \"mc2_stderr\": 0.014916546411376396\n }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|arc:challenge|25_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hellaswag|10_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T07-22-26.417131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T07-22-26.417131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T07-22-26.417131.parquet'
- config_name: results
data_files:
- split: 2023_10_11T07_22_26.417131
path:
- results_2023-10-11T07-22-26.417131.parquet
- split: latest
path:
- results_2023-10-11T07-22-26.417131.parquet
---
# Dataset Card for Evaluation run of AA051610/VA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AA051610/VA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [AA051610/VA](https://huggingface.co/AA051610/VA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__VA",
"harness_truthfulqa_mc_0",
split="train")
```
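The timestamped split naming described above can also be resolved programmatically. As a minimal local sketch (the split names below are illustrative examples, not this repository's actual run list):

```python
# Splits are named after the run timestamp (YYYY_MM_DDTHH_MM_SS.ffffff),
# plus a "latest" alias that mirrors the most recent run. Because this
# format sorts lexicographically in chronological order, the newest run
# can be picked with a plain max():
splits = ["2023_10_11T07_22_26.417131", "2023_10_28T11_50_58.128612", "latest"]
newest = max(s for s in splits if s != "latest")
print(newest)  # 2023_10_28T11_50_58.128612
```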
## Latest results
These are the [latest results from run 2023-10-11T07:22:26.417131](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__VA/blob/main/results_2023-10-11T07-22-26.417131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4972405415581996,
"acc_stderr": 0.03512578000813228,
"acc_norm": 0.5002960487991649,
"acc_norm_stderr": 0.03512615731416433,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.44928868954080875,
"mc2_stderr": 0.014916546411376396
},
"harness|arc:challenge|25": {
"acc": 0.3848122866894198,
"acc_stderr": 0.014218371065251105,
"acc_norm": 0.4138225255972696,
"acc_norm_stderr": 0.014392730009221007
},
"harness|hellaswag|10": {
"acc": 0.47390957976498704,
"acc_stderr": 0.004982983592459198,
"acc_norm": 0.6251742680740888,
"acc_norm_stderr": 0.004830885704380092
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709390974,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709390974
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.567741935483871,
"acc_stderr": 0.028181739720019416,
"acc_norm": 0.567741935483871,
"acc_norm_stderr": 0.028181739720019416
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.033764582465095665,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.033764582465095665
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846475,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846475
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6128440366972477,
"acc_stderr": 0.02088423199264345,
"acc_norm": 0.6128440366972477,
"acc_norm_stderr": 0.02088423199264345
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212093,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212093
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.029343114798094462,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.029343114798094462
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6781609195402298,
"acc_stderr": 0.0167063814150579,
"acc_norm": 0.6781609195402298,
"acc_norm_stderr": 0.0167063814150579
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5751445086705202,
"acc_stderr": 0.02661335084026174,
"acc_norm": 0.5751445086705202,
"acc_norm_stderr": 0.02661335084026174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468628,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468628
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4934640522875817,
"acc_stderr": 0.028627470550556054,
"acc_norm": 0.4934640522875817,
"acc_norm_stderr": 0.028627470550556054
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5594855305466238,
"acc_stderr": 0.028196400574197426,
"acc_norm": 0.5594855305466238,
"acc_norm_stderr": 0.028196400574197426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284073,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284073
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598806,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.02018014484330729,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.02018014484330729
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6081871345029239,
"acc_stderr": 0.03743979825926401,
"acc_norm": 0.6081871345029239,
"acc_norm_stderr": 0.03743979825926401
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.44928868954080875,
"mc2_stderr": 0.014916546411376396
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 64,776 |
deepghs/allinone_experiment | 2023-10-14T11:54:03.000Z | [
"region:us"
] | deepghs | null | null | 1 | 0 | 2023-10-11T07:40:54 | Entry not found | 15 |
hongerzh/my-NFT-dataset | 2023-10-11T08:00:40.000Z | [
"region:us"
] | hongerzh | null | null | 0 | 0 | 2023-10-11T07:45:16 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': test
'1': train
'2': validation
splits:
- name: train
num_bytes: 948013.0
num_examples: 7
- name: validation
num_bytes: 169094.0
num_examples: 2
- name: test
num_bytes: 169094.0
num_examples: 2
download_size: 1290909
dataset_size: 1286201.0
---
# Dataset Card for "my-NFT-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 782 |
Ahmed007/test001 | 2023-10-11T07:59:21.000Z | [
"region:us"
] | Ahmed007 | null | null | 0 | 0 | 2023-10-11T07:59:18 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 16610581.0
num_examples: 108
download_size: 15605780
dataset_size: 16610581.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test001"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 474 |
Ahmed107/jordan_audio | 2023-10-11T08:13:00.000Z | [
"region:us"
] | Ahmed107 | null | null | 0 | 0 | 2023-10-11T08:13:00 | Entry not found | 15 |
destitech/test_ta | 2023-10-11T08:28:04.000Z | [
"region:us"
] | destitech | null | null | 0 | 0 | 2023-10-11T08:26:44 | Entry not found | 15 |
open-llm-leaderboard/details_crumb__gpt2023 | 2023-10-24T11:34:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T08:31:08 | ---
pretty_name: Evaluation run of crumb/gpt2023
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [crumb/gpt2023](https://huggingface.co/crumb/gpt2023) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_crumb__gpt2023\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T11:33:48.204905](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__gpt2023/blob/main/results_2023-10-24T11-33-48.204905.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893119132,\n \"f1\": 0.04730285234899332,\n\
\ \"f1_stderr\": 0.0013435226639105919,\n \"acc\": 0.25210824971442214,\n\
\ \"acc_stderr\": 0.007783509925876781\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119132,\n\
\ \"f1\": 0.04730285234899332,\n \"f1_stderr\": 0.0013435226639105919\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245494\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529012\n\
\ }\n}\n```"
repo_url: https://huggingface.co/crumb/gpt2023
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|arc:challenge|25_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T11_33_48.204905
path:
- '**/details_harness|drop|3_2023-10-24T11-33-48.204905.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T11-33-48.204905.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T11_33_48.204905
path:
- '**/details_harness|gsm8k|5_2023-10-24T11-33-48.204905.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T11-33-48.204905.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hellaswag|10_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T08-30-54.655929.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T08-30-54.655929.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T08-30-54.655929.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T11_33_48.204905
path:
- '**/details_harness|winogrande|5_2023-10-24T11-33-48.204905.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T11-33-48.204905.parquet'
- config_name: results
data_files:
- split: 2023_10_11T08_30_54.655929
path:
- results_2023-10-11T08-30-54.655929.parquet
- split: 2023_10_24T11_33_48.204905
path:
- results_2023-10-24T11-33-48.204905.parquet
- split: latest
path:
- results_2023-10-24T11-33-48.204905.parquet
---
# Dataset Card for Evaluation run of crumb/gpt2023
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/crumb/gpt2023
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [crumb/gpt2023](https://huggingface.co/crumb/gpt2023) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_crumb__gpt2023",
"harness_winogrande_5",
split="train")
```
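The timestamped split names (e.g. `2023_10_24T11_33_48.204905`) appear to be derived from the run timestamp by replacing dashes and colons with underscores while keeping the fractional seconds. A minimal helper sketching that assumed convention (`split_name_from_timestamp` is illustrative, not part of any library):

```python
def split_name_from_timestamp(ts: str) -> str:
    """Map a run timestamp to its split name, e.g.
    '2023-10-24T11:33:48.204905' -> '2023_10_24T11_33_48.204905'."""
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")
```

This can be handy for selecting a specific run's split instead of `latest`.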
## Latest results
These are the [latest results from run 2023-10-24T11:33:48.204905](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__gpt2023/blob/main/results_2023-10-24T11-33-48.204905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119132,
"f1": 0.04730285234899332,
"f1_stderr": 0.0013435226639105919,
"acc": 0.25210824971442214,
"acc_stderr": 0.007783509925876781
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119132,
"f1": 0.04730285234899332,
"f1_stderr": 0.0013435226639105919
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245494
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529012
}
}
```
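The per-task entries in the results dict share a `harness|<task>|<n_shot>` key convention. A small illustrative sketch (not part of the card tooling) for pulling one metric per task out of such a dict, skipping tasks that don't report it:

```python
def task_metrics(results: dict, metric: str = "acc") -> dict:
    """Collect one metric per task from a results dict whose per-task
    keys follow the 'harness|<task>|<n_shot>' convention."""
    return {
        key.split("|")[1]: values[metric]
        for key, values in results.items()
        if key.startswith("harness|") and metric in values
    }

# Abbreviated copy of the latest-run results above:
latest = {
    "all": {"acc": 0.25210824971442214},
    "harness|drop|3": {"em": 0.0012583892617449664, "f1": 0.04730285234899332},
    "harness|gsm8k|5": {"acc": 0.003032600454890068},
    "harness|winogrande|5": {"acc": 0.5011838989739542},
}

print(task_metrics(latest))
# -> {'gsm8k': 0.003032600454890068, 'winogrande': 0.5011838989739542}
```

Note that `drop` is filtered out here because it reports `em`/`f1` rather than `acc`, and the `all` aggregate is excluded by the key-prefix check.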
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,485 | [...] |
distil-whisper/whisper_transcriptions_token_ids | 2023-10-11T17:03:40.000Z | [
"region:us"
] | distil-whisper | null | null | 0 | 0 | 2023-10-11T08:48:17 | Entry not found | 15 | [...] |
open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1 | 2023-10-28T13:22:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T09:21:44 | ---
pretty_name: Evaluation run of SkunkworksAI/Mistralic-7B-1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SkunkworksAI/Mistralic-7B-1](https://huggingface.co/SkunkworksAI/Mistralic-7B-1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T13:22:20.115560](https://huggingface.co/datasets/open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1/blob/main/results_2023-10-28T13-22-20.115560.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3366191275167785,\n\
\ \"em_stderr\": 0.004839388843031059,\n \"f1\": 0.43708682885906275,\n\
\ \"f1_stderr\": 0.004627060310059935,\n \"acc\": 0.44050675782818416,\n\
\ \"acc_stderr\": 0.010231909076615354\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3366191275167785,\n \"em_stderr\": 0.004839388843031059,\n\
\ \"f1\": 0.43708682885906275,\n \"f1_stderr\": 0.004627060310059935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1106899166034875,\n \
\ \"acc_stderr\": 0.008642172551392479\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838227\n\
\ }\n}\n```"
repo_url: https://huggingface.co/SkunkworksAI/Mistralic-7B-1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T13_22_20.115560
path:
- '**/details_harness|drop|3_2023-10-28T13-22-20.115560.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T13-22-20.115560.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T13_22_20.115560
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-22-20.115560.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-22-20.115560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-21-21.065888.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-21-21.065888.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T13_22_20.115560
path:
- '**/details_harness|winogrande|5_2023-10-28T13-22-20.115560.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T13-22-20.115560.parquet'
- config_name: results
data_files:
- split: 2023_10_11T09_21_21.065888
path:
- results_2023-10-11T09-21-21.065888.parquet
- split: 2023_10_28T13_22_20.115560
path:
- results_2023-10-28T13-22-20.115560.parquet
- split: latest
path:
- results_2023-10-28T13-22-20.115560.parquet
---
# Dataset Card for Evaluation run of SkunkworksAI/Mistralic-7B-1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SkunkworksAI/Mistralic-7B-1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [SkunkworksAI/Mistralic-7B-1](https://huggingface.co/SkunkworksAI/Mistralic-7B-1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
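As an informal illustration of the split-naming convention (an observation from this card's config listing, not part of any official tooling): a run timestamp such as `2023-10-28T13:22:20.115560` becomes the split name `2023_10_28T13_22_20.115560` by replacing `-` and `:` with `_`.

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp (as it appears in the results filenames) to the
    split name used in this dataset: '-' and ':' are replaced with '_'."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-10-28T13:22:20.115560"))
# -> 2023_10_28T13_22_20.115560
```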
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1",
"harness_winogrande_5",
split="train")
```
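The config names passed to `load_dataset` follow directly from the harness task ids that appear in the results JSON. This mapping is a sketch inferred from the config listing above (replace `|`, `:`, and `-` with `_`), not an official API:

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id from the results JSON to the corresponding
    dataset config name, e.g. 'harness|arc:challenge|25' ->
    'harness_arc_challenge_25'."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

data_config = task_to_config_name("harness|hendrycksTest-abstract_algebra|5")
print(data_config)  # -> harness_hendrycksTest_abstract_algebra_5
```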
## Latest results
These are the [latest results from run 2023-10-28T13:22:20.115560](https://huggingface.co/datasets/open-llm-leaderboard/details_SkunkworksAI__Mistralic-7B-1/blob/main/results_2023-10-28T13-22-20.115560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3366191275167785,
"em_stderr": 0.004839388843031059,
"f1": 0.43708682885906275,
"f1_stderr": 0.004627060310059935,
"acc": 0.44050675782818416,
"acc_stderr": 0.010231909076615354
},
"harness|drop|3": {
"em": 0.3366191275167785,
"em_stderr": 0.004839388843031059,
"f1": 0.43708682885906275,
"f1_stderr": 0.004627060310059935
},
"harness|gsm8k|5": {
"acc": 0.1106899166034875,
"acc_stderr": 0.008642172551392479
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838227
}
}
```
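The `all` block mirrors the per-task entries: `em`/`f1` come from DROP, while `acc` appears to be the mean of the accuracy-based tasks (GSM8K and WinoGrande). A quick sanity check using the numbers from the JSON above:

```python
# Per-task results copied from the JSON above.
results = {
    "harness|drop|3": {"em": 0.3366191275167785, "f1": 0.43708682885906275},
    "harness|gsm8k|5": {"acc": 0.1106899166034875},
    "harness|winogrande|5": {"acc": 0.7703235990528808},
}

# "acc" in the "all" block is the mean over tasks that report accuracy.
accs = [v["acc"] for v in results.values() if "acc" in v]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 6))
# → 0.440507
```

This matches the aggregated `"acc": 0.44050675782818416` reported in the `all` block.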
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,632 |
open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0 | 2023-10-24T12:56:30.000Z | ["region:us"] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T09:27:26 |
---
pretty_name: Evaluation run of OpenBuddy/openbuddy-falcon-180b-v13-preview0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-falcon-180b-v13-preview0](https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T12:56:17.890074](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0/blob/main/results_2023-10-24T12-56-17.890074.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.490876677852349,\n\
\ \"em_stderr\": 0.005119615515857085,\n \"f1\": 0.5498133389261767,\n\
\ \"f1_stderr\": 0.004838031306299291,\n \"acc\": 0.6212929481268546,\n\
\ \"acc_stderr\": 0.01211195240749183\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.490876677852349,\n \"em_stderr\": 0.005119615515857085,\n\
\ \"f1\": 0.5498133389261767,\n \"f1_stderr\": 0.004838031306299291\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4162244124336619,\n \
\ \"acc_stderr\": 0.013577788334652662\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480331\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|arc:challenge|25_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T12_56_17.890074
path:
- '**/details_harness|drop|3_2023-10-24T12-56-17.890074.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T12-56-17.890074.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T12_56_17.890074
path:
- '**/details_harness|gsm8k|5_2023-10-24T12-56-17.890074.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T12-56-17.890074.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hellaswag|10_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T12_56_17.890074
path:
- '**/details_harness|winogrande|5_2023-10-24T12-56-17.890074.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T12-56-17.890074.parquet'
- config_name: results
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- results_2023-10-11T09-27-08.727010.parquet
- split: 2023_10_11T10_53_08.711708
path:
- results_2023-10-11T10-53-08.711708.parquet
- split: 2023_10_24T12_56_17.890074
path:
- results_2023-10-24T12-56-17.890074.parquet
- split: latest
path:
- results_2023-10-24T12-56-17.890074.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-falcon-180b-v13-preview0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-falcon-180b-v13-preview0](https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0",
"harness_winogrande_5",
split="train")
```
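The split names above are derived from the run timestamps, with `:` and `-` replaced by `_` (the `latest` split always aliases the newest run). A small sketch of this naming convention, using a hypothetical helper:

```python
# Hypothetical helper illustrating the split-naming convention seen in this card:
# run timestamp "2023-10-24T12:56:17.890074" -> split "2023_10_24T12_56_17.890074".
def timestamp_to_split(ts: str) -> str:
    return ts.replace(":", "_").replace("-", "_")

print(timestamp_to_split("2023-10-24T12:56:17.890074"))
# -> 2023_10_24T12_56_17.890074
```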
## Latest results
These are the [latest results from run 2023-10-24T12:56:17.890074](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0/blob/main/results_2023-10-24T12-56-17.890074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.490876677852349,
"em_stderr": 0.005119615515857085,
"f1": 0.5498133389261767,
"f1_stderr": 0.004838031306299291,
"acc": 0.6212929481268546,
"acc_stderr": 0.01211195240749183
},
"harness|drop|3": {
"em": 0.490876677852349,
"em_stderr": 0.005119615515857085,
"f1": 0.5498133389261767,
"f1_stderr": 0.004838031306299291
},
"harness|gsm8k|5": {
"acc": 0.4162244124336619,
"acc_stderr": 0.013577788334652662
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480331
}
}
```
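In this results file, the `"all"` section aggregates the per-task metrics; for the two tasks that report `acc` here, it is their unweighted mean. A minimal sketch verifying this against the numbers above (the averaging convention is inferred from these values, not from documentation):

```python
# Per-task accuracies copied from the results JSON above.
results = {
    "all": {"acc": 0.6212929481268546},
    "harness|gsm8k|5": {"acc": 0.4162244124336619},
    "harness|winogrande|5": {"acc": 0.8263614838200474},
}

# The "all" accuracy matches the unweighted mean of the per-task accuracies.
accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(accs) / len(accs)
assert abs(mean_acc - results["all"]["acc"]) < 1e-12
```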
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 53,045 | [...] |
koshkidadanet/alpha | 2023-10-11T09:33:06.000Z | [
"region:us"
] | koshkidadanet | null | null | 0 | 0 | 2023-10-11T09:33:06 | Entry not found | 15 | [...] |
yavasde/permutated-wikitext | 2023-10-11T09:41:39.000Z | [
"region:us"
] | yavasde | null | null | 0 | 0 | 2023-10-11T09:41:33 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 11118443
num_examples: 98407
- name: test
num_bytes: 1312320
num_examples: 11960
- name: valid
num_bytes: 1165858
num_examples: 10360
download_size: 8428865
dataset_size: 13596621
---
# Dataset Card for "permutated-wikitext"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 489 | [...] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13-base | 2023-10-23T13:40:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T09:56:43 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-mistral-7b-v13-base](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T13:40:07.826401](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13-base/blob/main/results_2023-10-23T13-40-07.826401.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.28544463087248323,\n\
\ \"em_stderr\": 0.004625072383719666,\n \"f1\": 0.3571770134228197,\n\
\ \"f1_stderr\": 0.004531759792948092,\n \"acc\": 0.3628134250613192,\n\
\ \"acc_stderr\": 0.007861162191425665\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.28544463087248323,\n \"em_stderr\": 0.004625072383719666,\n\
\ \"f1\": 0.3571770134228197,\n \"f1_stderr\": 0.004531759792948092\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \
\ \"acc_stderr\": 0.003015294242890952\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7134964483030781,\n \"acc_stderr\": 0.01270703013996038\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T13_40_07.826401
path:
- '**/details_harness|drop|3_2023-10-23T13-40-07.826401.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T13-40-07.826401.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T13_40_07.826401
path:
- '**/details_harness|gsm8k|5_2023-10-23T13-40-07.826401.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T13-40-07.826401.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-56-20.350161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-56-20.350161.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-56-20.350161.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T13_40_07.826401
path:
- '**/details_harness|winogrande|5_2023-10-23T13-40-07.826401.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T13-40-07.826401.parquet'
- config_name: results
data_files:
- split: 2023_10_11T09_56_20.350161
path:
- results_2023-10-11T09-56-20.350161.parquet
- split: 2023_10_23T13_40_07.826401
path:
- results_2023-10-23T13-40-07.826401.parquet
- split: latest
path:
- results_2023-10-23T13-40-07.826401.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v13-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v13-base](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v13-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13-base",
"harness_winogrande_5",
split="train")
```
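As the summary above notes, each run appears as a split named after the run's timestamp. The mapping from timestamp to split name can be sketched with a small helper (an illustrative function inferred from the split names listed in this card's YAML, not part of the `datasets` API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2023-10-23T13:40:07.826401'
    into the split name used in this dataset's configs,
    e.g. '2023_10_23T13_40_07.826401'.

    The date's hyphens and the time's colons both become underscores;
    the 'T' separator and the fractional seconds are kept as-is."""
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_timestamp_to_split("2023-10-23T13:40:07.826401"))
# -> 2023_10_23T13_40_07.826401
```

Passing such a split name instead of `"latest"` to `load_dataset` selects that specific run's results.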
## Latest results
These are the [latest results from run 2023-10-23T13:40:07.826401](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v13-base/blob/main/results_2023-10-23T13-40-07.826401.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.28544463087248323,
"em_stderr": 0.004625072383719666,
"f1": 0.3571770134228197,
"f1_stderr": 0.004531759792948092,
"acc": 0.3628134250613192,
"acc_stderr": 0.007861162191425665
},
"harness|drop|3": {
"em": 0.28544463087248323,
"em_stderr": 0.004625072383719666,
"f1": 0.3571770134228197,
"f1_stderr": 0.004531759792948092
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.003015294242890952
},
"harness|winogrande|5": {
"acc": 0.7134964483030781,
"acc_stderr": 0.01270703013996038
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,776 | [
[
-0.02923583984375,
-0.052764892578125,
0.00835418701171875,
0.01702880859375,
-0.00830078125,
-0.0018682479858398438,
-0.0269775390625,
-0.0109100341796875,
0.0191650390625,
0.03753662109375,
-0.039093017578125,
-0.066650390625,
-0.0447998046875,
0.006118774... |
DataOceanAI/Off_the_self_dataset | 2023-10-11T10:31:30.000Z | [
"task_categories:conversational",
"task_categories:text-generation",
"license:unknown",
"datasets",
"dataoceanai",
"speechocean",
"ASR",
"TTS",
"region:us"
] | DataOceanAI | null | null | 0 | 0 | 2023-10-11T10:05:01 | ---
license: unknown
task_categories:
- conversational
- text-generation
tags:
- datasets
- dataoceanai
- speechocean
- ASR
- TTS
pretty_name: DataOcean AI - Off the shelf datasets
---
# Introduction
<!-- Provide a quick summary of the dataset. -->
DataOcean AI (SHA stock code: 688787), founded in 2005, is one of the earliest AI training data solution providers in China.
As the first domestically listed enterprise in AI training data, DataOcean AI is committed to providing AI datasets and services for AI enterprises and R&D institutions.
DataOcean AI specializes in delivering comprehensive, multilingual, cross-domain, and multimodal AI datasets, along with a range of data-related services. Our offerings include data annotation, data collection, data design, and model evaluation, catering to the diverse needs of enterprises across various industries. Our services encompass essential domains such as smart voice (including voice recognition and voice synthesis), computer vision, and natural language processing, spanning approximately 200 primary languages and dialects from around the globe.
DataOcean AI has been actively involved in the industry for nearly two decades and has developed close to 700 deep partnerships with leading IT companies, academic institutions, and emerging AI enterprises. It has successfully delivered thousands of customized projects and earned the deep trust of customers by focusing on competent, dependable, and safe data services. The company's superior resources, which cover 190+ languages and dialects in more than 70 countries, together with its technologically leading algorithm R&D team and well-experienced project teams, are valuable assets that contribute to the successful implementation of frontier AI projects around the world.
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [DATAOCEAN AI](https://en.dataoceanai.com/)
- **License:** Commercial
Check out the [files](https://huggingface.co/datasets/DataOceanAI/Off_the_self_dataset/tree/main) or visit our website for details
## Contact
You can always contact us via email at "contact@dataoceanai.com" or fill out the [contact form](https://en.dataoceanai.com/?m=index&c=dsvoice&a=consult&aboutus_id=9619) on our website: https://en.dataoceanai.com/
<!-- Address questions around how the dataset is intended to be used. -->
| 2,435 | [
[
-0.018951416015625,
-0.0289154052734375,
-0.00946807861328125,
0.0181732177734375,
-0.01654052734375,
0.0170745849609375,
-0.00751495361328125,
-0.03253173828125,
0.01401519775390625,
0.0399169921875,
-0.0631103515625,
-0.04742431640625,
-0.01617431640625,
-... |
open-llm-leaderboard/details_Weyaxi__SlimOpenOrca-Mistral-7B | 2023-10-24T00:40:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | 0 | 0 | 2023-10-11T10:05:06 | ---
pretty_name: Evaluation run of Weyaxi/SlimOpenOrca-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/SlimOpenOrca-Mistral-7B](https://huggingface.co/Weyaxi/SlimOpenOrca-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__SlimOpenOrca-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T00:40:26.410334](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SlimOpenOrca-Mistral-7B/blob/main/results_2023-10-24T00-40-26.410334.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n\
\ \"em_stderr\": 0.0006781451620479603,\n \"f1\": 0.0900964765100671,\n\
\ \"f1_stderr\": 0.001791740655538585,\n \"acc\": 0.494413205574767,\n\
\ \"acc_stderr\": 0.011528615182477716\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479603,\n\
\ \"f1\": 0.0900964765100671,\n \"f1_stderr\": 0.001791740655538585\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21455648218347234,\n \
\ \"acc_stderr\": 0.011307604104052887\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902547\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/SlimOpenOrca-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|arc:challenge|25_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T00_40_26.410334
path:
- '**/details_harness|drop|3_2023-10-24T00-40-26.410334.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T00-40-26.410334.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T00_40_26.410334
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-40-26.410334.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T00-40-26.410334.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hellaswag|10_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-04-43.187576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T10-04-43.187576.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T10-04-43.187576.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T00_40_26.410334
path:
- '**/details_harness|winogrande|5_2023-10-24T00-40-26.410334.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T00-40-26.410334.parquet'
- config_name: results
data_files:
- split: 2023_10_11T10_04_43.187576
path:
- results_2023-10-11T10-04-43.187576.parquet
- split: 2023_10_24T00_40_26.410334
path:
- results_2023-10-24T00-40-26.410334.parquet
- split: latest
path:
- results_2023-10-24T00-40-26.410334.parquet
---
# Dataset Card for Evaluation run of Weyaxi/SlimOpenOrca-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/SlimOpenOrca-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/SlimOpenOrca-Mistral-7B](https://huggingface.co/Weyaxi/SlimOpenOrca-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__SlimOpenOrca-Mistral-7B",
"harness_winogrande_5",
split="train")
```
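The configuration and split names follow a simple convention: pipes and colons in the harness task names become underscores in the configuration name, and dashes and colons in a run timestamp become underscores in its split name. A minimal sketch of that mapping (the helper names are hypothetical, not part of `datasets` or this repository):

```python
# Hypothetical helpers illustrating the naming conventions used by this
# dataset's configurations and splits (not part of any library).

def task_to_config_name(task: str) -> str:
    # e.g. "harness|arc:challenge|25" -> "harness_arc_challenge_25"
    return task.replace("|", "_").replace(":", "_")

def run_timestamp_to_split(timestamp: str) -> str:
    # e.g. "2023-10-24T00:40:26.410334" -> "2023_10_24T00_40_26.410334"
    return timestamp.replace("-", "_").replace(":", "_")
```

For example, the DROP details from the 2023-10-24 run live in the `harness_drop_3` configuration under the `2023_10_24T00_40_26.410334` split (or `latest`, which points to the same files).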
## Latest results
These are the [latest results from run 2023-10-24T00:40:26.410334](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SlimOpenOrca-Mistral-7B/blob/main/results_2023-10-24T00-40-26.410334.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479603,
"f1": 0.0900964765100671,
"f1_stderr": 0.001791740655538585,
"acc": 0.494413205574767,
"acc_stderr": 0.011528615182477716
},
"harness|drop|3": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479603,
"f1": 0.0900964765100671,
"f1_stderr": 0.001791740655538585
},
"harness|gsm8k|5": {
"acc": 0.21455648218347234,
"acc_stderr": 0.011307604104052887
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902547
}
}
```
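As a sanity check, the aggregate `acc` in the `"all"` block appears to be the plain mean of the per-task accuracies (gsm8k and winogrande in this run). A minimal sketch reproducing the number above, assuming that averaging scheme:

```python
# Per-task accuracies copied from the results JSON above.
task_accs = {
    "harness|gsm8k|5": 0.21455648218347234,
    "harness|winogrande|5": 0.7742699289660616,
}

# The "all" accuracy is the unweighted mean over tasks reporting "acc".
mean_acc = sum(task_accs.values()) / len(task_accs)
print(round(mean_acc, 15))  # 0.494413205574767
```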
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | 38,674 | [
---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7993647281
num_examples: 14492
download_size: 3225112197
dataset_size: 7993647281
---
# Dataset Card for "book_data_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)