| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
tinhpx2911/vietnamese_book_10k | 2023-10-10T10:19:35.000Z | [
"region:us"
] | tinhpx2911 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: name
dtype: string
splits:
- name: train
num_bytes: 1607495469
num_examples: 9961
download_size: 844824154
dataset_size: 1607495469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "10kvnbook"
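The YAML config above declares a single `train` split with two string columns, `text` and `name`. A minimal loading sketch, assuming only the standard `datasets` API (nothing here is dataset-specific):
```python
from datasets import load_dataset

# Single "train" split, ~9,961 examples per the dataset_info above.
ds = load_dataset("tinhpx2911/vietnamese_book_10k", split="train")
example = ds[0]
print(example["name"])        # presumably the book title
print(example["text"][:200])  # the book content itself
```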
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ismailiismail/French_English_2 | 2023-10-10T10:18:28.000Z | [
"region:us"
] | ismailiismail | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: phrase
dtype: string
- name: paraphrase
dtype: string
splits:
- name: train
num_bytes: 619554
num_examples: 1997
download_size: 266052
dataset_size: 619554
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "French_English_2"
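Per the config above, the dataset is a single `train` split of 1,997 rows with `phrase` and `paraphrase` string columns. A minimal loading sketch, assuming the standard `datasets` API:
```python
from datasets import load_dataset

# 1,997 phrase/paraphrase string pairs per the dataset_info above.
ds = load_dataset("ismailiismail/French_English_2", split="train")
row = ds[0]
print(row["phrase"], "->", row["paraphrase"])
```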
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangshuoming/c_x86_exebench_json | 2023-10-10T12:48:05.000Z | [
"region:us"
] | zhangshuoming | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1906085908
num_examples: 868385
download_size: 514911495
dataset_size: 1906085908
---
# Dataset Card for "c_x86_exebench_json"
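The config above declares one `text` column across 868,385 training examples (~1.9 GB uncompressed), so streaming may be preferable to a full download. A minimal sketch, assuming the standard `datasets` API:
```python
from datasets import load_dataset

# Stream rather than materializing ~1.9 GB of text locally.
ds = load_dataset("zhangshuoming/c_x86_exebench_json", split="train", streaming=True)
first = next(iter(ds))
print(first["text"][:200])
```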
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down | 2023-10-10T10:22:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T10:20:42.158103](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-10T10-20-42.158103.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5396426695343253,\n\
\ \"acc_stderr\": 0.03470115682226027,\n \"acc_norm\": 0.5439904411720377,\n\
\ \"acc_norm_stderr\": 0.03468261597672341,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.41889365391647926,\n\
\ \"mc2_stderr\": 0.014206984898193394\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.014609667440892574,\n\
\ \"acc_norm\": 0.5571672354948806,\n \"acc_norm_stderr\": 0.014515573873348895\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6084445329615614,\n\
\ \"acc_stderr\": 0.00487100593940747,\n \"acc_norm\": 0.8154750049790879,\n\
\ \"acc_norm_stderr\": 0.0038711896202760668\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009794,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009794\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.035886248000917075,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.035886248000917075\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178815,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"\
acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.03404705328653879,\n \"\
acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.03404705328653879\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.029443773022594693,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.029443773022594693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071722,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071722\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335435,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335435\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.02629622791561367,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.02629622791561367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.565359477124183,\n \"acc_stderr\": 0.02838425670488304,\n\
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02838425670488304\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.0274666102131401,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.0274666102131401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.027431623722415005,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.027431623722415005\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n\
\ \"acc_stderr\": 0.012663412101248338,\n \"acc_norm\": 0.43546284224250326,\n\
\ \"acc_norm_stderr\": 0.012663412101248338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.03023375855159644,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.03023375855159644\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \
\ \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5428571428571428,\n \"acc_stderr\": 0.031891418324213966,\n\
\ \"acc_norm\": 0.5428571428571428,\n \"acc_norm_stderr\": 0.031891418324213966\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.032801882053486435,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.032801882053486435\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.03789134424611551,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.03789134424611551\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.41889365391647926,\n\
\ \"mc2_stderr\": 0.014206984898193394\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-20-42.158103.parquet'
- config_name: results
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- results_2023-10-10T10-20-42.158103.parquet
- split: latest
path:
- results_2023-10-10T10-20-42.158103.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
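The same call works for any of the 61 configs listed in the YAML header. For example, a sketch pulling the aggregated metrics from the `results` config, using the `latest` split that aliases the most recent timestamped run:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down",
    "results",
    split="latest",
)
print(results[0])
```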
## Latest results
These are the [latest results from run 2023-10-10T10:20:42.158103](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-10T10-20-42.158103.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5396426695343253,
"acc_stderr": 0.03470115682226027,
"acc_norm": 0.5439904411720377,
"acc_norm_stderr": 0.03468261597672341,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.41889365391647926,
"mc2_stderr": 0.014206984898193394
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.014609667440892574,
"acc_norm": 0.5571672354948806,
"acc_norm_stderr": 0.014515573873348895
},
"harness|hellaswag|10": {
"acc": 0.6084445329615614,
"acc_stderr": 0.00487100593940747,
"acc_norm": 0.8154750049790879,
"acc_norm_stderr": 0.0038711896202760668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009794,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009794
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.035886248000917075,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.035886248000917075
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178815,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624528,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.03404705328653879,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.03404705328653879
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071722,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071722
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285712,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285712
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335435,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335435
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.02629622791561367,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.02629622791561367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02838425670488304,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02838425670488304
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.0274666102131401,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.0274666102131401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.027431623722415005,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.027431623722415005
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43546284224250326,
"acc_stderr": 0.012663412101248338,
"acc_norm": 0.43546284224250326,
"acc_norm_stderr": 0.012663412101248338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.03023375855159644,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.03023375855159644
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.02006287424353913,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.02006287424353913
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5428571428571428,
"acc_stderr": 0.031891418324213966,
"acc_norm": 0.5428571428571428,
"acc_norm_stderr": 0.031891418324213966
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.032801882053486435,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.032801882053486435
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.03789134424611551,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.03789134424611551
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.41889365391647926,
"mc2_stderr": 0.014206984898193394
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TrainingDataPro/generated-passports-segmentation | 2023-10-10T10:27:32.000Z | [
"task_categories:image-segmentation",
"language:en",
"license:cc-by-nc-nd-4.0",
"finance",
"legal",
"code",
"region:us"
] | TrainingDataPro | null | null | null | 1 | 0 | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-segmentation
language:
- en
tags:
- finance
- legal
- code
---
# GENERATED USA Passports Segmentation
The dataset contains a collection of images representing **GENERATED USA Passports**. Each passport image is segmented into different zones, including the **passport zone, photo, name, surname, date of birth, sex, nationality, passport number, and MRZ (Machine Readable Zone)**.
The dataset can be utilized for *computer vision, object detection, data extraction, and training machine learning models*.
Generated passports can assist in conducting research without accessing or compromising real user data that is often sensitive and subject to privacy regulations. **Synthetic data generation** allows researchers to *develop and refine models using simulated passport data without risking privacy leaks*.

### The dataset is solely for informational or educational purposes and should not be used for any fraudulent or deceptive activities.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=generated-passports-segmentation) to discuss your requirements, learn about the price and buy the dataset.
# Dataset structure
- **images** - contains generated images of passports
- **labels** - includes segmentation masks created for the original images
- **annotations.xml** - contains the coordinates of the polygons created for the original images
# Data Format
Each image from the `images` folder is accompanied by an XML annotation in the `annotations.xml` file indicating the coordinates of the polygons and labels. For each point, the x and y coordinates are provided.
### Classes:
- **passport**: passport zone,
- **photo**: photo of the person,
- **number**: number of the passport,
- **name**: name of the person,
- **surname**: surname of the person,
- **date_of_birth**: date of birth of the person,
- **nationality**: nationality of the person,
- **sex**: sex of the person,
- **mrz**: MRZ in the passport,
- **other**: other text in the passport
# Example of XML file structure

# GENERATED USA Passports Segmentation can be made in accordance with your requirements.
## **[TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=generated-passports-segmentation)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o | 2023-10-10T10:28:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T10:27:05.033674](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o/blob/main/results_2023-10-10T10-27-05.033674.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5401496862310199,\n\
\ \"acc_stderr\": 0.034857027975596505,\n \"acc_norm\": 0.5442313544698612,\n\
\ \"acc_norm_stderr\": 0.03483758281921596,\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.43024105261650214,\n\
\ \"mc2_stderr\": 0.014310654215426323\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5520477815699659,\n \"acc_stderr\": 0.014532011498211672,\n\
\ \"acc_norm\": 0.5870307167235495,\n \"acc_norm_stderr\": 0.014388344935398326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6107349133638718,\n\
\ \"acc_stderr\": 0.004865871290143341,\n \"acc_norm\": 0.8165704043019318,\n\
\ \"acc_norm_stderr\": 0.003862273626504544\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621502,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621502\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.041042692118062316,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.041042692118062316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.027430866579973467,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.027430866579973467\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n \
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.019069098363191428,\n \"\
acc_norm\": 0.728440366972477,\n \"acc_norm_stderr\": 0.019069098363191428\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"\
acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.030165137867847008,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.030165137867847008\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009168,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009168\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494581,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.02639410417764363,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.02639410417764363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\
\ \"acc_stderr\": 0.016155910721341777,\n \"acc_norm\": 0.37094972067039106,\n\
\ \"acc_norm_stderr\": 0.016155910721341777\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02818059632825929,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02818059632825929\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.02942799403941999,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.02942799403941999\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4165580182529335,\n\
\ \"acc_stderr\": 0.01259115324505739,\n \"acc_norm\": 0.4165580182529335,\n\
\ \"acc_norm_stderr\": 0.01259115324505739\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979034,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979034\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.0378913442461155,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.0378913442461155\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2962056303549572,\n\
\ \"mc1_stderr\": 0.015983595101811392,\n \"mc2\": 0.43024105261650214,\n\
\ \"mc2_stderr\": 0.014310654215426323\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-27-05.033674.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-27-05.033674.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-27-05.033674.parquet'
- config_name: results
data_files:
- split: 2023_10_10T10_27_05.033674
path:
- results_2023-10-10T10-27-05.033674.parquet
- split: latest
path:
- results_2023-10-10T10-27-05.033674.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r16-q_k_v_o) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o",
"harness_truthfulqa_mc_0",
split="train")
```
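The aggregated metrics can be loaded the same way through the "results" config declared in the YAML metadata above, using the "latest" split (which, like "train", points to the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics of the latest run, from the "results" config
# declared in the YAML metadata above.
results = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o",
                       "results",
                       split="latest")
```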
## Latest results
These are the [latest results from run 2023-10-10T10:27:05.033674](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r16-q_k_v_o/blob/main/results_2023-10-10T10-27-05.033674.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5401496862310199,
"acc_stderr": 0.034857027975596505,
"acc_norm": 0.5442313544698612,
"acc_norm_stderr": 0.03483758281921596,
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.43024105261650214,
"mc2_stderr": 0.014310654215426323
},
"harness|arc:challenge|25": {
"acc": 0.5520477815699659,
"acc_stderr": 0.014532011498211672,
"acc_norm": 0.5870307167235495,
"acc_norm_stderr": 0.014388344935398326
},
"harness|hellaswag|10": {
"acc": 0.6107349133638718,
"acc_stderr": 0.004865871290143341,
"acc_norm": 0.8165704043019318,
"acc_norm_stderr": 0.003862273626504544
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.041042692118062316,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.041042692118062316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.027430866579973467,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.027430866579973467
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.03201650100739611,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.03201650100739611
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.019069098363191428,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.019069098363191428
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.030165137867847008,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.030165137867847008
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009168,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009168
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494581,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.02639410417764363,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.02639410417764363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.016155910721341777,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.016155910721341777
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.02942799403941999,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.02942799403941999
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4165580182529335,
"acc_stderr": 0.01259115324505739,
"acc_norm": 0.4165580182529335,
"acc_norm_stderr": 0.01259115324505739
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979034,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979034
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.0378913442461155,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.0378913442461155
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2962056303549572,
"mc1_stderr": 0.015983595101811392,
"mc2": 0.43024105261650214,
"mc2_stderr": 0.014310654215426323
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pechaut/cairo-code | 2023-10-10T10:34:07.000Z | [
"license:apache-2.0",
"region:us"
] | pechaut | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
fmeleard/moody_data | 2023-10-10T10:37:19.000Z | [
"task_categories:summarization",
"task_categories:conversational",
"language:fr",
"license:apache-2.0",
"region:us"
] | fmeleard | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- summarization
- conversational
language:
- fr
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TokenWhisperer/laptops-2014-v2 | 2023-10-10T10:38:09.000Z | [
"region:us"
] | TokenWhisperer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2851637
num_examples: 3002
- name: test
num_bytes: 713531
num_examples: 786
download_size: 475054
dataset_size: 3565168
---
# Dataset Card for "laptops-2014-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pechaut/cairo-instruct | 2023-10-10T10:42:03.000Z | [
"license:apache-2.0",
"region:us"
] | pechaut | null | null | null | 0 | 0 | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: eval
num_bytes: 856
num_examples: 5
- name: train
num_bytes: 61524
num_examples: 197
download_size: 31483
dataset_size: 62380
configs:
- config_name: default
data_files:
- split: eval
path: data/eval-*
- split: train
path: data/train-*
---
|
JzJd/post-llm | 2023-10-10T10:59:15.000Z | [
"license:afl-3.0",
"region:us"
] | JzJd | null | null | null | 0 | 0 | ---
license: afl-3.0
---
|
junaav/flower512 | 2023-10-10T22:18:21.000Z | [
"license:other",
"region:us"
] | junaav | null | null | null | 0 | 0 | ---
license: other
license_name: others
license_link: LICENSE
---
|
kubegems/default | 2023-10-10T10:56:35.000Z | [
"license:apache-2.0",
"region:us"
] | kubegems | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Rricardo/benahavis | 2023-10-10T15:06:13.000Z | [
"region:us"
] | Rricardo | null | null | null | 0 | 0 | Entry not found |
reza-alipour/yelp_small | 2023-10-10T11:04:23.000Z | [
"region:us"
] | reza-alipour | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': 1 star
'1': 2 star
'2': 3 stars
'3': 4 stars
'4': 5 stars
- name: text
dtype: string
splits:
- name: train
num_bytes: 3750914
num_examples: 5000
download_size: 2320162
dataset_size: 3750914
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "yelp_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangwang825/sst2-textfooler-5 | 2023-10-10T11:18:40.000Z | [
"region:us"
] | yangwang825 | null | null | null | 0 | 0 | Entry not found |
indolem/IndoMMLU | 2023-10-10T17:45:30.000Z | [
"license:cc-by-2.0",
"arxiv:2310.04928",
"arxiv:2112.10668",
"arxiv:2302.13971",
"region:us"
] | indolem | null | @inproceedings{koto-etal-2023-indommlu,
title = "Large Language Models Only Pass Primary School Exams in {I}ndonesia: A Comprehensive Test on {I}ndo{MMLU}",
author = "Fajri Koto and Nurul Aisyah and Haonan Li and Timothy Baldwin",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
} | null | 1 | 0 | ---
license: cc-by-2.0
---
# IndoMMLU
<!---
[](https://github.com/internLM/OpenCompass/) [](https://github.com/EleutherAI/lm-evaluation-harness)
-->
<p align="center"> <img src="https://raw.githubusercontent.com/fajri91/eval_picts/master/IndoMMLU-Bar.png" style="width: 100%;" id="title-icon">
</p>
<p align="center"> <a href="http://www.fajrikoto.com" target="_blank">Fajri Koto</a>, <a href="https://www.linkedin.com/in/nuaisyah/" target="_blank">Nurul Aisyah</a>, <a href="https://haonan-li.github.io/" target="_blank">Haonan Li</a>, <a href="https://people.eng.unimelb.edu.au/tbaldwin/" target="_blank">Timothy Baldwin</a> </p>
<h4 align="center">
<p align="center" style="display: flex; flex-direction: row; justify-content: center; align-items: center">
📄 <a href="https://arxiv.org/abs/2310.04928" target="_blank" style="margin-right: 15px; margin-left: 10px">Paper</a> •
🏆 <a href="https://github.com/fajri91/IndoMMLU/blob/main/README_EN.md#evaluation" target="_blank" style="margin-left: 10px">Leaderboard</a> •
🤗 <a href="https://huggingface.co/datasets/indolem/indommlu" target="_blank" style="margin-left: 10px">Dataset</a>
</p>
</h4>
## Introduction
We introduce IndoMMLU, the first multi-task language understanding benchmark for Indonesian culture and languages,
which consists of questions from primary school to university entrance exams in Indonesia. By employing professional teachers,
we obtain 14,906 questions across 63 tasks and education levels, with 46% of the questions focusing on assessing proficiency
in the Indonesian language and knowledge of nine local languages and cultures in Indonesia.
<p align="left"> <img src="https://github.com/fajri91/eval_picts/blob/master/IndoMMLU-dist.png?raw=true" style="width: 500px;" id="title-icon"> </p>
## Subjects
| Level | Subjects |
|-----------|------------------------------------|
| SD (Primary School) | Science, Social science, Civics, Indonesian Language, Balinese, Makassarese, Banjarese, Lampungic, Madurese, Sundanese, Javanese, Dayak Ngaju, Minangkabau culture, Art, Sports, Islam religion, Christian religion, Hindu religion |
| SMP (Junior High School) | Science, Social science, Civics, Indonesian Language, Balinese, Makassarese, Banjarese, Lampungic, Madurese, Sundanese, Javanese, Minangkabau culture, Art, Sports, Islam religion, Christian religion, Hindu religion |
| SMA (Senior High School) | Physics, Chemistry, Biology, Geography, Sociology, Economics, History, Civics, Indonesian Language, Balinese, Makassarese, Banjarese, Lampungic, Madurese, Sundanese, Javanese, Art, Sports, Islam religion, Christian religion, Hindu religion |
| University Entrance Test | Chemistry, Biology, Geography, Sociology, Economics, History, Indonesian Language |
We categorize the collected questions into different subject areas, including: (1) STEM (Science, Technology, Engineering, and Mathematics); (2) Social Science; (3) Humanities; (4) Indonesian Language; and (5) Local Languages and Cultures.
## Examples
These questions are written in Indonesian. For local language subjects, some are written in the local languages. The English version is for illustrative purposes only.
<p align="left">
<img src="https://github.com/fajri91/eval_picts/blob/master/min_example.png?raw=true" style="width: 400px;" id="title-icon">
</p>
## Evaluation
We evaluate 24 multilingual LLMs of different sizes in zero-shot and few-shot settings. These include [GPT-3.5 (ChatGPT)](https://chat.openai.com/), [XGLM](https://arxiv.org/abs/2112.10668), [Falcon](https://falconllm.tii.ae/), [BLOOMZ](https://huggingface.co/bigscience/bloomz), [mT0](https://huggingface.co/bigscience/mt0-xxl), [LLaMA](https://arxiv.org/abs/2302.13971), and [Bactrian-X](https://github.com/mbzuai-nlp/bactrian-x). Prior to the question and multiple-choice options, we add a simple prompt in the Indonesian language:
```
Ini adalah soal [subject] untuk [level]. Pilihlah salah satu jawaban yang dianggap benar!
English Translation: This is a [subject] question for [level]. Please choose the correct answer!
```
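As a rough sketch of how such a prompt might be assembled in code (the option lettering and the trailing `Jawaban:` cue are illustrative assumptions, not the exact format used in `evaluate.py`):
```python
def format_prompt(subject: str, level: str, question: str, options: list) -> str:
    # Indonesian instruction, as shown above
    header = (f"Ini adalah soal {subject} untuk {level}. "
              "Pilihlah salah satu jawaban yang dianggap benar!")
    # Letter the options A, B, C, ... (an illustrative choice, not the paper's spec)
    lettered = "\n".join(f"{chr(65 + i)}. {opt}" for i, opt in enumerate(options))
    return f"{header}\n\n{question}\n{lettered}\nJawaban:"
```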
#### Zero-shot Evaluation
| Model (#param) | STEM | Social Science | Humanities | Indonesian Lang. | Local L. Culture | Average |
|---------------------|------|----------|-------------|---------|----------|---------|
| Random | 21.9 | 23.4 | 23.5 | 24.4 | 26.6 | 24.4 |
| [GPT-3.5 (175B)](https://chat.openai.com/) | **54.3** | **62.5** | **64.0** | **62.2** | 39.3 | **53.2** |
| [XGLM (564M)](https://huggingface.co/facebook/xglm-564M) | 22.1 | 23.0 | 25.6 | 25.6 | 27.5 | 25.2 |
| [XGLM (1.7B)](https://huggingface.co/facebook/xglm-1.7B) | 20.9 | 23.0 | 24.6 | 24.8 | 26.6 | 24.4 |
| [XGLM (2.9B)](https://huggingface.co/facebook/xglm-2.9B) | 22.9 | 23.2 | 25.4 | 26.3 | 27.2 | 25.2 |
| [XGLM (4.5B)](https://huggingface.co/facebook/xglm-4.5B) | 21.8 | 23.1 | 25.6 | 25.8 | 27.1 | 25.0 |
| [XGLM (7.5B)](https://huggingface.co/facebook/xglm-7.5B) | 22.7 | 21.7 | 23.6 | 24.5 | 27.5 | 24.5 |
| [Falcon (7B)](https://huggingface.co/tiiuae/falcon-7b) | 22.1 | 22.9 | 25.5 | 25.7 | 27.5 | 25.1 |
| [Falcon (40B)](https://huggingface.co/tiiuae/falcon-40b) | 30.2 | 34.8 | 34.8 | 34.9 | 29.2 | 32.1 |
| [BLOOMZ (560M)](https://huggingface.co/bigscience/bloomz-560m) | 22.9 | 23.6 | 23.2 | 24.2 | 25.1 | 24.0 |
| [BLOOMZ (1.1B)](https://huggingface.co/bigscience/bloomz-1b1) | 20.4 | 21.4 | 21.1 | 23.5 | 24.7 | 22.4 |
| [BLOOMZ (1.7B)](https://huggingface.co/bigscience/bloomz-1b7) | 31.5 | 39.3 | 38.3 | 42.8 | 29.4 | 34.4 |
| [BLOOMZ (3B)](https://huggingface.co/bigscience/bloomz-3b) | 33.5 | 44.5 | 39.7 | 46.7 | 29.8 | 36.4 |
| [BLOOMZ (7.1B)](https://huggingface.co/bigscience/bloomz-7b1) | 37.1 | 46.7 | 44.0 | 49.1 | 28.2 | 38.0 |
| [mT0<sub>small</sub> (300M)](https://huggingface.co/bigscience/mt0-small) | 21.8 | 21.4 | 25.7 | 25.1 | 27.6 | 24.9 |
| [mT0<sub>base</sub> (580M)](https://huggingface.co/bigscience/mt0-base) | 22.6 | 22.6 | 25.7 | 25.6 | 26.9 | 25.0 |
| [mT0<sub>large</sub> (1.2B)](https://huggingface.co/bigscience/mt0-large) | 22.0 | 23.4 | 25.1 | 27.3 | 27.6 | 25.2 |
| [mT0<sub>xl</sub> (3.7B)](https://huggingface.co/bigscience/mt0-xl) | 31.4 | 42.9 | 41.0 | 47.8 | 35.7 | 38.2 |
| [mT0<sub>xxl</sub> (13B)](https://huggingface.co/bigscience/mt0-xxl) | 33.5 | 46.2 | 47.9 | 52.6 | **39.6** | 42.5 |
| [LLaMA (7B)](https://arxiv.org/abs/2302.13971) | 22.8 | 23.1 | 25.1 | 26.7 | 27.6 | 25.3 |
| [LLaMA (13B)](https://arxiv.org/abs/2302.13971) | 24.1 | 23.0 | 24.4 | 29.5 | 26.7 | 25.3 |
| [LLaMA (30B)](https://arxiv.org/abs/2302.13971) | 25.4 | 23.5 | 25.9 | 28.4 | 28.7 | 26.5 |
| [LLaMA (65B)](https://arxiv.org/abs/2302.13971) | 33.0 | 37.7 | 40.8 | 41.4 | 32.1 | 35.8 |
| [Bactrian-X-LLaMA (7B)](https://github.com/mbzuai-nlp/bactrian-x) | 23.3 | 24.0 | 26.0 | 26.1 | 27.5 | 25.7 |
| [Bactrian-X-LLaMA (13B)](https://github.com/mbzuai-nlp/bactrian-x) | 28.3 | 29.9 | 32.8 | 35.2 | 29.2 | 30.3 |
#### GPT-3.5 performance (% accuracy) across different education levels
<p align="left">
<img src="https://github.com/fajri91/eval_picts/blob/master/IndoMMLU-result.png?raw=true" style="width: 370px;" id="title-icon">
</p>
Red indicates that the score is below the minimum passing threshold of 65, while green signifies a score at or above this minimum. We can observe that ChatGPT generally reaches the passing score of 65 only on Indonesian primary school exams.
#### Few-shot Evaluation
<p align="left">
<img src="https://github.com/fajri91/eval_picts/blob/master/plot_fewshot.png?raw=true" style="width: 380px;" id="title-icon">
</p>
## Data
Each question in the dataset is a multiple-choice question with up to 5 options, exactly one of which is correct.
We provide our dataset organized by subject in the [data](data) folder. You can also access it via [Hugging Face](https://huggingface.co/datasets/indolem/indommlu).
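A minimal loading sketch (this card does not document the schema or split names, so the snippet simply inspects them):
```python
from datasets import load_dataset

ds = load_dataset("indolem/IndoMMLU")  # dataset id as listed on the Hub
print(ds)                              # show the available splits
first_split = next(iter(ds))
print(ds[first_split][0])              # inspect the schema of one example
```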
<!--
#### Quick Use
Our dataset has been added to [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness) and [OpenCompass](https://github.com/InternLM/opencompass), you can evaluate your model via these open-source tools.
-->
#### Evaluation
The evaluation code for each model is in `evaluate.py`, and the commands to run it are listed in `run.sh`.
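Those scripts are not reproduced here; as an illustrative sketch of standard likelihood-based multiple-choice scoring (using, for example, the XGLM (564M) checkpoint from the table above), one could write:
```python
# A minimal sketch of likelihood-based multiple-choice scoring; this shows the
# usual approach for such benchmarks, not a verbatim copy of evaluate.py.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("facebook/xglm-564M")
model = AutoModelForCausalLM.from_pretrained("facebook/xglm-564M").eval()

def option_logprob(prompt: str, option: str) -> float:
    """Sum of log-probs the model assigns to `option` given `prompt`.
    Assumes the tokenization of `prompt` is a prefix of `prompt + option`."""
    n_prompt = tok(prompt, return_tensors="pt").input_ids.shape[1]
    ids = tok(prompt + option, return_tensors="pt").input_ids
    with torch.no_grad():
        logprobs = torch.log_softmax(model(ids).logits[0, :-1], dim=-1)
    targets = ids[0, 1:]
    return sum(logprobs[i, targets[i]].item()
               for i in range(n_prompt - 1, len(targets)))

def predict(prompt: str, options: list) -> int:
    # Leading space so each option is tokenized as a new word (heuristic)
    return max(range(len(options)),
               key=lambda i: option_logprob(prompt, " " + options[i]))
```
The predicted answer is simply the option whose tokens receive the highest summed log-probability given the prompt.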
## Citation
```
@inproceedings{koto-etal-2023-indommlu,
title = "Large Language Models Only Pass Primary School Exams in {I}ndonesia: A Comprehensive Test on {I}ndo{MMLU}",
author = "Fajri Koto and Nurul Aisyah and Haonan Li and Timothy Baldwin",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
}
```
## License
The IndoMMLU dataset is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](http://creativecommons.org/licenses/by-nc-sa/4.0/).
|
yangwang825/sst2-textbugger-5 | 2023-10-10T11:22:09.000Z | [
"region:us"
] | yangwang825 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k | 2023-10-10T11:20:38.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mncai/Mistral-7B-OpenOrca-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Mistral-7B-OpenOrca-1k](https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T11:19:13.410150](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k/blob/main/results_2023-10-10T11-19-13.410150.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6220828368688038,\n\
\ \"acc_stderr\": 0.0333381086288331,\n \"acc_norm\": 0.6259740111463021,\n\
\ \"acc_norm_stderr\": 0.03331442785224609,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5296423544101955,\n\
\ \"mc2_stderr\": 0.015339874902349726\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009126,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6537542322246565,\n\
\ \"acc_stderr\": 0.004748003276466209,\n \"acc_norm\": 0.8466440948018323,\n\
\ \"acc_norm_stderr\": 0.003595938124166216\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635484,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635484\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201036,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809784,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809784\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407006,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407006\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316555,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2849162011173184,\n\
\ \"acc_stderr\": 0.015096222302469799,\n \"acc_norm\": 0.2849162011173184,\n\
\ \"acc_norm_stderr\": 0.015096222302469799\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n\
\ \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n\
\ \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241747,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241747\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.5296423544101955,\n\
\ \"mc2_stderr\": 0.015339874902349726\n }\n}\n```"
repo_url: https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-19-13.410150.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-19-13.410150.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-19-13.410150.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_19_13.410150
path:
- results_2023-10-10T11-19-13.410150.parquet
- split: latest
path:
- results_2023-10-10T11-19-13.410150.parquet
---
# Dataset Card for Evaluation run of mncai/Mistral-7B-OpenOrca-1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/Mistral-7B-OpenOrca-1k](https://huggingface.co/mncai/Mistral-7B-OpenOrca-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k",
"harness_truthfulqa_mc_0",
split="train")
```
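To inspect only the aggregated scores, you can load the `results` configuration directly; a minimal sketch using the same `datasets` API and the `latest` split defined in the configurations above:
```python
from datasets import load_dataset

# Load only the aggregated scores; the "latest" split always points
# to the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k",
    "results",
    split="latest",
)
```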
## Latest results
These are the [latest results from run 2023-10-10T11:19:13.410150](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k/blob/main/results_2023-10-10T11-19-13.410150.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6220828368688038,
"acc_stderr": 0.0333381086288331,
"acc_norm": 0.6259740111463021,
"acc_norm_stderr": 0.03331442785224609,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5296423544101955,
"mc2_stderr": 0.015339874902349726
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009126,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6537542322246565,
"acc_stderr": 0.004748003276466209,
"acc_norm": 0.8466440948018323,
"acc_norm_stderr": 0.003595938124166216
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635484,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201036,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809784,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809784
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407006,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407006
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2849162011173184,
"acc_stderr": 0.015096222302469799,
"acc_norm": 0.2849162011173184,
"acc_norm_stderr": 0.015096222302469799
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.02971932942241747,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.02971932942241747
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.5296423544101955,
"mc2_stderr": 0.015339874902349726
}
}
```
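Alternatively, the raw JSON file linked above can be fetched directly; a minimal sketch using `huggingface_hub` (the filename is taken from the link above, and the printed keys are only for inspection):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw aggregated-results file linked above and parse it.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_mncai__Mistral-7B-OpenOrca-1k",
    filename="results_2023-10-10T11-19-13.410150.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(sorted(results))  # inspect the available top-level keys
```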
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yangwang825/sst2-pwws-5 | 2023-10-10T11:27:22.000Z | [
"region:us"
] | yangwang825 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down | 2023-10-10T11:26:27.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T11:25:01.199069](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down/blob/main/results_2023-10-10T11-25-01.199069.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5495005982268492,\n\
\ \"acc_stderr\": 0.03452729598059282,\n \"acc_norm\": 0.5538011481690731,\n\
\ \"acc_norm_stderr\": 0.034507446304033906,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.402271195410076,\n\
\ \"mc2_stderr\": 0.014079017857356345\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5281569965870307,\n \"acc_stderr\": 0.014588204105102205,\n\
\ \"acc_norm\": 0.5716723549488054,\n \"acc_norm_stderr\": 0.014460496367599015\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6113324039036049,\n\
\ \"acc_stderr\": 0.004864513262194313,\n \"acc_norm\": 0.8215494921330412,\n\
\ \"acc_norm_stderr\": 0.0038210900827217115\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.042639068927951336,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.042639068927951336\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028604,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028604\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.039439666991836285,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.039439666991836285\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416402,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416402\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.015384352284543943,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.015384352284543943\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.02611374936131034,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.02611374936131034\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2871508379888268,\n\
\ \"acc_stderr\": 0.01513160884996375,\n \"acc_norm\": 0.2871508379888268,\n\
\ \"acc_norm_stderr\": 0.01513160884996375\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664278,\n\
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664278\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.02686949074481525,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.02686949074481525\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983967,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983967\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468307,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468307\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.01999797303545833,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.01999797303545833\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.402271195410076,\n\
\ \"mc2_stderr\": 0.014079017857356345\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-25-01.199069.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-25-01.199069.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-25-01.199069.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_25_01.199069
path:
- results_2023-10-10T11-25-01.199069.parquet
- split: latest
path:
- results_2023-10-10T11-25-01.199069.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
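You can also target a single evaluation and a specific snapshot directly. A minimal sketch (the `harness_hendrycksTest_world_religions_5` config and the "latest" split below are taken from this card's config list; any other config name or timestamped split from the list works the same way):
```python
from datasets import load_dataset

# Load only the world_religions MMLU sub-task, pinned to the newest snapshot
data = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(data)
```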
## Latest results
These are the [latest results from run 2023-10-10T11:25:01.199069](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down/blob/main/results_2023-10-10T11-25-01.199069.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5495005982268492,
"acc_stderr": 0.03452729598059282,
"acc_norm": 0.5538011481690731,
"acc_norm_stderr": 0.034507446304033906,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.402271195410076,
"mc2_stderr": 0.014079017857356345
},
"harness|arc:challenge|25": {
"acc": 0.5281569965870307,
"acc_stderr": 0.014588204105102205,
"acc_norm": 0.5716723549488054,
"acc_norm_stderr": 0.014460496367599015
},
"harness|hellaswag|10": {
"acc": 0.6113324039036049,
"acc_stderr": 0.004864513262194313,
"acc_norm": 0.8215494921330412,
"acc_norm_stderr": 0.0038210900827217115
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.042639068927951336,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.042639068927951336
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028604,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028604
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.039439666991836285,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.039439666991836285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416402,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416402
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.015384352284543943,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.015384352284543943
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.02611374936131034,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.02611374936131034
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2871508379888268,
"acc_stderr": 0.01513160884996375,
"acc_norm": 0.2871508379888268,
"acc_norm_stderr": 0.01513160884996375
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.028332397483664278,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.028332397483664278
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.02686949074481525,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.02686949074481525
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983967,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983967
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468307,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468307
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.01999797303545833,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.01999797303545833
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.402271195410076,
"mc2_stderr": 0.014079017857356345
}
}
```
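The aggregated numbers above are also stored in the "results" configuration described earlier; a small sketch for reloading them, assuming the "latest" split listed in this card's config section:
```python
from datasets import load_dataset

# The "results" config holds the run-level aggregated metrics as a parquet file
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-gate_up_down",
    "results",
    split="latest",
)
print(results.column_names)
```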
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k | 2023-10-10T11:27:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of mncai/Mistral-7B-openplatypus-1k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/Mistral-7B-openplatypus-1k](https://huggingface.co/mncai/Mistral-7B-openplatypus-1k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T11:26:36.133476](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k/blob/main/results_2023-10-10T11-26-36.133476.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5987327353548585,\n\
\ \"acc_stderr\": 0.033868715674081146,\n \"acc_norm\": 0.6025940270406707,\n\
\ \"acc_norm_stderr\": 0.033846695262711564,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.498569630806063,\n\
\ \"mc2_stderr\": 0.015133442762891728\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182528,\n\
\ \"acc_norm\": 0.6015358361774744,\n \"acc_norm_stderr\": 0.014306946052735565\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6487751443935471,\n\
\ \"acc_stderr\": 0.004763774981834674,\n \"acc_norm\": 0.8424616610237005,\n\
\ \"acc_norm_stderr\": 0.0036356303524759065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.025649381063029265,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.025649381063029265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300993,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300993\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624528,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.03186608121408832,\n \
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.03186608121408832\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.01787121776779022,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.01787121776779022\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333569,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333569\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242836,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\
\ \"acc_stderr\": 0.016155910721341767,\n \"acc_norm\": 0.37094972067039106,\n\
\ \"acc_norm_stderr\": 0.016155910721341767\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281406,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281406\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144373,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144373\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032197,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032197\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.02881472242225418,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.02881472242225418\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.498569630806063,\n\
\ \"mc2_stderr\": 0.015133442762891728\n }\n}\n```"
repo_url: https://huggingface.co/mncai/Mistral-7B-openplatypus-1k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-26-36.133476.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-26-36.133476.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-26-36.133476.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_26_36.133476
path:
- results_2023-10-10T11-26-36.133476.parquet
- split: latest
path:
- results_2023-10-10T11-26-36.133476.parquet
---
# Dataset Card for Evaluation run of mncai/Mistral-7B-openplatypus-1k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/Mistral-7B-openplatypus-1k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/Mistral-7B-openplatypus-1k](https://huggingface.co/mncai/Mistral-7B-openplatypus-1k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Details for the 0-shot TruthfulQA task; the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k",
    "harness_truthfulqa_mc_0",
    split="train")
```
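The same pattern works for the aggregated scores: the `results` configuration listed above exposes one row per run, with a timestamped split per evaluation and a `latest` split tracking the most recent one. A minimal sketch (the column layout of the results parquet is not documented here, so the print is just for inspection):
```python
from datasets import load_dataset

# Aggregated metrics; swap "latest" for the timestamped split
# (e.g. "2023_10_10T11_26_36.133476") to pin an exact run.
agg = load_dataset("open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k",
    "results",
    split="latest")
print(agg[0])
```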
## Latest results
These are the [latest results from run 2023-10-10T11:26:36.133476](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k/blob/main/results_2023-10-10T11-26-36.133476.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5987327353548585,
"acc_stderr": 0.033868715674081146,
"acc_norm": 0.6025940270406707,
"acc_norm_stderr": 0.033846695262711564,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.498569630806063,
"mc2_stderr": 0.015133442762891728
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182528,
"acc_norm": 0.6015358361774744,
"acc_norm_stderr": 0.014306946052735565
},
"harness|hellaswag|10": {
"acc": 0.6487751443935471,
"acc_stderr": 0.004763774981834674,
"acc_norm": 0.8424616610237005,
"acc_norm_stderr": 0.0036356303524759065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029265,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300993,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300993
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624528,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.03186608121408832,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.03186608121408832
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.01787121776779022,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.01787121776779022
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333569,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242836,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.016155910721341767,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.016155910721341767
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281406,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281406
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144373,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144373
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032197,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032197
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.02881472242225418,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.02881472242225418
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.498569630806063,
"mc2_stderr": 0.015133442762891728
}
}
```
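Since every MMLU subtask is reported individually above, the macro-average is easy to recompute from the raw results file. A minimal sketch using `huggingface_hub` (the filename comes from the link above; the `data.get("results", data)` fallback is an assumption covering both a nested and a flat JSON layout):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_mncai__Mistral-7B-openplatypus-1k",
    filename="results_2023-10-10T11-26-36.133476.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)
results = data.get("results", data)  # assumption: metrics may be nested under "results"

mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```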
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down | 2023-10-10T11:33:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T11:32:04.979499](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-10T11-32-04.979499.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5384833665980309,\n\
\ \"acc_stderr\": 0.034724029174590776,\n \"acc_norm\": 0.5427417208806691,\n\
\ \"acc_norm_stderr\": 0.034705620291695424,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.40253751912204466,\n\
\ \"mc2_stderr\": 0.013956667981718091\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.01460449612939491,\n\
\ \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064663\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6060545708026289,\n\
\ \"acc_stderr\": 0.004876243842318607,\n \"acc_norm\": 0.8137821151165107,\n\
\ \"acc_norm_stderr\": 0.003884868131822895\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.02418049716437689,\n \"acc_norm\"\
: 0.328042328042328,\n \"acc_norm_stderr\": 0.02418049716437689\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.033456784227567746,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.033456784227567746\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.02533900301010651,\n \
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.02533900301010651\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911498,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911498\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.031811100324139266,\n\
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.031811100324139266\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7302752293577982,\n \"acc_stderr\": 0.01902848671111544,\n \"\
acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.01902848671111544\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6708860759493671,\n \"acc_stderr\": 0.03058732629470236,\n \
\ \"acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.03058732629470236\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.02742100729539292,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.02742100729539292\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n\
\ \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.7381864623243933,\n\
\ \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n\
\ \"acc_stderr\": 0.01577491142238162,\n \"acc_norm\": 0.3340782122905028,\n\
\ \"acc_norm_stderr\": 0.01577491142238162\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.02721042037593402,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.02721042037593402\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.02746009955700513,\n\
\ \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.02746009955700513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n\
\ \"acc_stderr\": 0.01261297436939098,\n \"acc_norm\": 0.4217731421121252,\n\
\ \"acc_norm_stderr\": 0.01261297436939098\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5506535947712419,\n \"acc_stderr\": 0.02012376652802727,\n \
\ \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.02012376652802727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893783,\n\
\ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348643,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.0378913442461155,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.0378913442461155\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.40253751912204466,\n\
\ \"mc2_stderr\": 0.013956667981718091\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-32-04.979499.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-32-04.979499.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-32-04.979499.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_32_04.979499
path:
- results_2023-10-10T11-32-04.979499.parquet
- split: latest
path:
- results_2023-10-10T11-32-04.979499.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-10T11:32:04.979499](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-10T11-32-04.979499.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5384833665980309,
"acc_stderr": 0.034724029174590776,
"acc_norm": 0.5427417208806691,
"acc_norm_stderr": 0.034705620291695424,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.40253751912204466,
"mc2_stderr": 0.013956667981718091
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.01460449612939491,
"acc_norm": 0.5588737201365188,
"acc_norm_stderr": 0.014509747749064663
},
"harness|hellaswag|10": {
"acc": 0.6060545708026289,
"acc_stderr": 0.004876243842318607,
"acc_norm": 0.8137821151165107,
"acc_norm_stderr": 0.003884868131822895
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437689,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437689
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.033456784227567746,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.033456784227567746
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.02533900301010651,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.02533900301010651
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911498,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911498
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.031811100324139266,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.031811100324139266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.01902848671111544,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.01902848671111544
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.03058732629470236,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.03058732629470236
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.02742100729539292,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.02742100729539292
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.026074314851657083,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.026074314851657083
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.01577491142238162,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.01577491142238162
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.02721042037593402,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.02721042037593402
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.02746009955700513,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.02746009955700513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.01261297436939098,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.01261297436939098
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.0378913442461155,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.0378913442461155
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.40253751912204466,
"mc2_stderr": 0.013956667981718091
}
}
```
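As a minimal sketch (assuming the `results` configuration and `latest` split declared in the YAML header of this card), the aggregated numbers above can be loaded directly:
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent run: the "latest" split of the
# "results" configuration points at results_2023-10-10T11-32-04.979499.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-q_k_v_o_gate_up_down",
    "results",
    split="latest",
)
print(results)  # inspect the columns carrying the aggregated scores
```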
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
skaltenp/textworld_cooking_augmented | 2023-10-10T13:06:02.000Z | [
"region:us"
] | skaltenp | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_demo
path: data/train_demo-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: z8_path
dtype: string
- name: id
dtype: string
- name: ulx_path
dtype: string
- name: score
dtype: int64
- name: demonstration
sequence:
sequence: string
- name: moves
dtype: int64
- name: json_path
dtype: string
splits:
- name: train
num_bytes: 111951142
num_examples: 16272
- name: train_demo
num_bytes: 284608
num_examples: 856
- name: valid
num_bytes: 5418846
num_examples: 872
- name: test
num_bytes: 5613370
num_examples: 868
download_size: 21851355
dataset_size: 123267966
---
# Dataset Card for "textworld_cooking_augmented"
This is a synthetically generated dataset based on the [Textworld Cooking Game](https://github.com/microsoft/TextWorld/tree/main/textworld/challenges/tw_cooking).
Since the GitHub repository supports generating games, it is used to produce the games and their walkthroughs.
The training walkthroughs are augmented with additional "human behavior", such as checking the inventory between steps and moving to other rooms.
There is an additional split called "train_demo", which will be filled with human demonstration data in future work.
The most important column is the "demonstration" column. It contains the engine and player text of the generated games and can be used to train LLMs to play the game, as shown in the sketch below.
The game files linked in the "path" columns can be downloaded [here](https://drive.google.com/file/d/1lDeqTyQqrw06e6dAHDnXXrEC9r7a2weJ/view?usp=sharing).
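A minimal loading sketch follows; the exact pairing of engine text and player commands inside each inner sequence of "demonstration" is an assumption based on the description above:
```python
from datasets import load_dataset

# Splits declared in the YAML header: train, train_demo, valid, test.
ds = load_dataset("skaltenp/textworld_cooking_augmented", split="train")

example = ds[0]
# "demonstration" is a sequence of string sequences; per the card it holds the
# engine text and the player commands of one generated game (pairing assumed).
for step in example["demonstration"]:
    print(step)
```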
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down | 2023-10-10T11:39:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T11:38:23.134636](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down/blob/main/results_2023-10-10T11-38-23.134636.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5532792929357678,\n\
\ \"acc_stderr\": 0.03466556230198259,\n \"acc_norm\": 0.5573361420620734,\n\
\ \"acc_norm_stderr\": 0.03464703940863801,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.01576477083677731,\n \"mc2\": 0.40761831229954526,\n\
\ \"mc2_stderr\": 0.014115250524318926\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6122286397132045,\n\
\ \"acc_stderr\": 0.004862461799370389,\n \"acc_norm\": 0.8191595299741088,\n\
\ \"acc_norm_stderr\": 0.0038409935166272614\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6419354838709678,\n \"acc_stderr\": 0.027273890594300642,\n \"\
acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.027273890594300642\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n \"\
acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736232,\n\
\ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736232\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7522935779816514,\n \"acc_stderr\": 0.018508143602547832,\n \"\
acc_norm\": 0.7522935779816514,\n \"acc_norm_stderr\": 0.018508143602547832\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912073,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912073\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277895,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277895\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.01578800719018589,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.01578800719018589\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.02827549015679146,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.02827549015679146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925647,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925647\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811945,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811945\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534205,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534205\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.01576477083677731,\n \"mc2\": 0.40761831229954526,\n\
\ \"mc2_stderr\": 0.014115250524318926\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T11-38-23.134636.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-38-23.134636.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T11-38-23.134636.parquet'
- config_name: results
data_files:
- split: 2023_10_10T11_38_23.134636
path:
- results_2023-10-10T11-38-23.134636.parquet
- split: latest
path:
- results_2023-10-10T11-38-23.134636.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r4-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down",
"harness_truthfulqa_mc_0",
split="train")
```
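The per-task details follow the same pattern; for example, this sketch loads the most recent run of a single MMLU subtask (config names match the `config_name` entries listed above, and the "latest" split aliases the newest timestamped run):
```python
from datasets import load_dataset

# Any of the per-task configurations listed above can be used here;
# "latest" always resolves to the newest timestamped run.
details = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```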
## Latest results
These are the [latest results from run 2023-10-10T11:38:23.134636](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r4-gate_up_down/blob/main/results_2023-10-10T11-38-23.134636.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5532792929357678,
"acc_stderr": 0.03466556230198259,
"acc_norm": 0.5573361420620734,
"acc_norm_stderr": 0.03464703940863801,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.01576477083677731,
"mc2": 0.40761831229954526,
"mc2_stderr": 0.014115250524318926
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.6122286397132045,
"acc_stderr": 0.004862461799370389,
"acc_norm": 0.8191595299741088,
"acc_norm_stderr": 0.0038409935166272614
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5230769230769231,
"acc_stderr": 0.025323990861736232,
"acc_norm": 0.5230769230769231,
"acc_norm_stderr": 0.025323990861736232
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7522935779816514,
"acc_stderr": 0.018508143602547832,
"acc_norm": 0.7522935779816514,
"acc_norm_stderr": 0.018508143602547832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912073,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912073
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196697,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277895,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277895
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.01578800719018589,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.01578800719018589
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.02827549015679146,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.02827549015679146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925647,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925647
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811945,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811945
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534205,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534205
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.01576477083677731,
"mc2": 0.40761831229954526,
"mc2_stderr": 0.014115250524318926
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
polinaeterna/OpenOrca | 2023-10-10T11:45:53.000Z | [
"task_categories:conversational",
"task_categories:text-classification",
"task_categories:token-classification",
"task_categories:table-question-answering",
"task_categories:question-answering",
"task_categories:zero-shot-classification",
"task_categories:summarization",
"task_categories:feature-extraction",
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:10M<n<100M",
"language:en",
"license:mit",
"arxiv:2306.02707",
"arxiv:2301.13688",
"region:us"
] | polinaeterna | null | null | null | 0 | 0 | ---
language:
- en
license: mit
task_categories:
- conversational
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
pretty_name: OpenOrca
size_categories:
- 10M<n<100M
---
## Table of Contents
- [Dataset Summary](#dataset-summary)
- [Dataset Attribution](#dataset-attribution)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Dataset Use](#dataset-use)
- [Use Cases](#use-cases)
- [Usage Caveats](#usage-caveats)
- [Getting Started](#getting-started)
<p><h1>🐋 The OpenOrca Dataset! 🐋</h1></p>

<a name="dataset-announcement"></a>
We are thrilled to announce the release of the OpenOrca dataset!
This rich collection of augmented FLAN data aligns, as best as possible, with the distributions outlined in the [Orca paper](https://arxiv.org/abs/2306.02707).
It has been instrumental in generating high-performing model checkpoints and serves as a valuable resource for all NLP researchers and developers!
# Official Models
## Mistral-7B-OpenOrca
Our [latest model](https://huggingface.co/spaces/Open-Orca/Mistral-7B-OpenOrca), the first 7B to score better overall than all previous models below 30B.
98% of Llama2-70b-chat's performance, in a completely open 7B!
## OpenOrca-Platypus2-13B
Our [third model](https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B), the first 13B model to score higher than LLaMA1-65B on the HuggingFace Leaderboard!
Released in partnership with Platypus.
## LlongOrca 7B & 13B
* Our [first 7B release](https://huggingface.co/Open-Orca/LlongOrca-7B-16k), trained on top of LLongMA2 to achieve 16,000 tokens context. #1 long context 7B model at release time, with >99% of the overall #1 model's performance.
* [LlongOrca-13B-16k](https://huggingface.co/Open-Orca/LlongOrca-13B-16k), trained on top of LLongMA2. #1 long context 13B model at release time, with >97% of the overall #1 model's performance.
## OpenOrcaxOpenChat-Preview2-13B
Our [second model](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B), highlighting that we've surpassed the performance reported in the Orca paper.
Was #1 at release time, now surpassed by our own OpenOrca-Platypus2-13B.
Released in partnership with OpenChat.
## OpenOrca-Preview1-13B
[OpenOrca-Preview1-13B](https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B)
This model was trained in less than a day, for <$200, with <10% of our data.
At release, it beat the current state of the art models on BigBench-Hard and AGIEval. Achieves ~60% of the improvements reported in the Orca paper.
<a name="dataset-summary"></a>
# Dataset Summary
The OpenOrca dataset is a collection of augmented [FLAN Collection data](https://arxiv.org/abs/2301.13688).
Currently ~1M GPT-4 completions, and ~3.2M GPT-3.5 completions.
It is tabularized in alignment with the distributions presented in the ORCA paper and currently represents a partial completion of the full intended dataset, with ongoing generation to expand its scope.
The data is primarily used for training and evaluation in the field of natural language processing.
<a name="dataset-attribution"></a>
# Dataset Attribution
We would like to give special recognition to the following contributors for their significant efforts and dedication:
* Teknium
* WingLian/Caseus
* Eric Hartford
* NanoBit
* Pankaj
* Winddude
* Rohan

http://AlignmentLab.ai:

* Autometa
* Entropi
* AtlasUnified
* NeverendingToast
* NanoBit
* WingLian/Caseus
Also of course, as always, TheBloke, for being the backbone of the whole community.
Many thanks to NanoBit and Caseus, makers of [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl), for lending us their expertise on the platform that developed and trained manticore, minotaur, and many others!
We are welcoming sponsors or collaborators to help us build these models to the scale they deserve. Please reach out via our socials:
http://Alignmentlab.ai https://discord.gg/n9hXaBPWxx
Want to visualize our full dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2).
[<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2)
<a name="supported-tasks-and-leaderboards"></a>
# Supported Tasks and Leaderboards
This dataset supports a range of tasks including language modeling, text generation, and text augmentation.
It has been instrumental in the generation of multiple high-performing model checkpoints which have exhibited exceptional performance in our unit testing.
Further information on leaderboards will be updated as they become available.
<a name="languages"></a>
# Languages
The language of the data is primarily English.
<a name="dataset-structure"></a>
# Dataset Structure
<a name="data-instances"></a>
## Data Instances
A data instance in this dataset represents entries from the FLAN collection which have been augmented by submitting the listed question to either GPT-4 or GPT-3.5.
The response is then entered into the response field.
<a name="data-fields"></a>
## Data Fields
The fields are as follows (an illustrative record is sketched after the list):
1) 'id', a unique numbered identifier which includes one of 'niv', 't0', 'cot', or 'flan' to represent which source FLAN Collection submix the 'question' is sourced from.
2) 'system_prompt', representing the System Prompt presented to the GPT-3.5 or GPT-4 API for the datapoint.
3) 'question', representing a question entry as provided by the FLAN Collection.
4) 'response', a response to that question received from a query to either GPT-3.5 or GPT-4.
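Concretely, a single record is a flat mapping over these four fields. The values below are invented placeholders, not real datapoints:
```python
# Hypothetical record; every value here is a made-up placeholder.
example = {
    "id": "flan.123456",  # the prefix ('niv', 't0', 'cot', or 'flan') marks the source submix
    "system_prompt": "You are an AI assistant. Provide a detailed answer.",
    "question": "A question entry taken from the FLAN Collection...",
    "response": "The GPT-4 or GPT-3.5 completion generated for that question...",
}
```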
<a name="data-splits"></a>
## Data Splits
The data is unsplit.
<a name="dataset-creation"></a>
# Dataset Creation
<a name="curation-rationale"></a>
## Curation Rationale
The dataset was created to provide a source of augmented text data for researchers and developers.
The datapoints are intended primarily to provide an enhancement of the core FLAN Collection data which relies upon the detailed step by step reasoning capabilities of GPT-3.5 and GPT-4.
This "reasoning trace" augmentation has demonstrated exceptional results, allowing a LLaMA-13B model trained with this data to rival or beat GPT-3.5 on broad sets of hard reasoning tasks which all models below 100B parameters had previously performed dramatically worse on.
<a name="source-data"></a>
## Source Data
The data is generated using techniques in alignment with the distributions outlined in the Orca paper, except as noted below:
1) There is not enough CoT data in the FLAN Collection to generate 150K zero-shot entries, as the paper purports to use.
We suspect this portion was either undocumented or misrepresented. We have used the ~75K points available.
2) We used the pre-generated FLAN Collection datasets hosted on HuggingFace under conceptofmind, e.g. [conceptofmind/flan2021](https://huggingface.co/datasets/conceptofmind/flan2021_submix_original).
These are referenced by the [official FLAN Collection repo](https://github.com/google-research/FLAN/tree/main/flan/v2) as the preferred data source.
However, these are a subset of the full FLAN Collection data, and have fewer than the required entries for the flan2021 and t0 submixes, by ~1.25M and 200k respectively.
Combined, this gave us ~1.5M fewer datapoints than in the original Orca paper. Completing the set is an ongoing work.
<a name="dataset-use"></a>
# Dataset Use
<a name="use-cases"></a>
## Use Cases
The dataset can be used for tasks related to language understanding, natural language processing, machine learning model training, and model performance evaluation.
<a name="usage-caveats"></a>
## Usage Caveats
Given that this is a work-in-progress dataset, it is recommended to regularly check for updates and improvements.
Further, the data should be used in accordance with the guidelines and recommendations outlined in the Orca paper.
<a name="getting-started"></a>
## Getting Started
This dataset is organized such that it can be naively loaded via the Hugging Face `datasets` library.
We recommend using streaming due to the large size of the files.
Regular updates and data generation progress can be monitored through the OpenOrca repository on Hugging Face.
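As a minimal sketch of such a streaming load (the repo id `Open-Orca/OpenOrca` is assumed here; point it at whichever mirror you use):
```python
from itertools import islice

from datasets import load_dataset

# Streaming iterates over rows lazily instead of downloading all files up front.
ds = load_dataset("Open-Orca/OpenOrca", split="train", streaming=True)

# Peek at a few records without materializing the whole dataset.
for row in islice(ds, 3):
    print(row["id"], "->", row["question"][:60])
```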
# Citation
```bibtex
@misc{OpenOrca,
title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv}
}
@software{touvron2023llama1,
title={LLaMA: Open and Efficient Foundation Language Models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
``` |
ordererecprime/ErecPrime | 2023-10-10T11:50:40.000Z | [
"region:us"
] | ordererecprime | null | null | null | 0 | 0 | Have you also tried out a plethora of products on the market that claim to boost male health but only ever got disappointed? If yes, ErecPrime is the last product that you’ll be required to **[try out now](https://snoppymart.com/erecprime/)**.
The advanced and natural formula of the magical [ErecPrime](https://snoppymart.com/erecprime/) is created to support virility and boost libido in men. With its all-natural composition, the tonic has been proven to improve the overall health and well-being of customers. Every ingredient used in this product has been well-researched and only then selected to ensure safe and effective usage of the ErecPrime.
[ErecPrime product image](https://snoppymart.com/erecprime/)
As their website claims, over 88,730 users have tried and loved this product! This underscores the maker's faith in their product and supports their claim of offering one of the most effective and purest male health solutions.
There's a lot more that you need to know about ErecPrime, and that's exactly what we are here for today. We'll tell you how the tonic works, what its benefits are, how it is priced, and much more! Read this detailed review till the end to make sure you don't miss out on any crucial information.
But first, let’s start with a quick summary of ErecPrime:
**Product Category:** Health Supplement

**Product Name:** ErecPrime

**Health Focus:** Male Health

**Product Form:** Capsules

**Side Effects:** Currently, no studies or ErecPrime reviews have reported side effects of using the supplement. (Check out the reviews!)

**Key Features:**

* Manufactured in an FDA-registered facility in the US
* Made with natural ingredients
* Easy to use
* GMO-free
* Stimulant-free
* No side effects

**Benefits:**

* Increases libido
* Promotes a healthy prostate
* Boosts energy levels

**Pricing:** A single bottle of ErecPrime costs $69.

**Money-Back Guarantee:** Applicable for 60 days

**ErecPrime Reviews:** ErecPrime reviews are generally positive.

**Where to Buy?** You can purchase ErecPrime only from its official website.
### Buy Link >> [https://snoppymart.com/erecprime/](https://snoppymart.com/erecprime/)
How Does The ErecPrime Work?
----------------------------
All you have to do is take a capsule of ErecPrime every day with plenty of water.
With its extraordinary formulation of mineral and plant-based extracts, the tonic then works to target a particular enzyme, called the 'erection enzyme', that is responsible for your sexual performance.
This enzyme relaxes all the muscles in the penis and increases the production of nitric oxide in your body. This further improves the blood circulation to your penis and results in hard and long-lasting erections. The product gradually prepares your penis to start getting erections naturally, something that may once have seemed a thing of the past.
[ErecPrime product image](https://snoppymart.com/erecprime/)
What Are The Natural Ingredients That Go Into The Making Of ErecPrime?
----------------------------------------------------------------------
Let’s now take a look at the ingredients present in ErecPrime that make it as effective as it is for promoting male virility:
### Rehmanniae Radix
At a molecular level, Rehmanniae Radix contains bioactive compounds such as iridoid glycosides, catalpol, and aucubin. These compounds exert their effects by interacting with various physiological processes in the body. One key mechanism by which Rehmanniae Radix supports workout performance is through its modulation of the hypothalamic-pituitary-adrenal (HPA) axis.
The HPA axis plays a crucial role in the body’s response to stress and exercise. During intense physical activity, the HPA axis is activated, leading to the release of cortisol, a hormone that helps regulate energy metabolism and response to stress. However, excessive cortisol release can have detrimental effects on workout performance and energy levels.
Rehmanniae Radix has been found to regulate the HPA axis and normalize cortisol levels. By doing so, it helps prevent the negative impact of excessive cortisol release during intense exercise. This regulation of cortisol levels contributes to improved workout performance and enhanced energy levels in men.
[Get started with the ErecPrime today!](https://snoppymart.com/erecprime/) |
hyeonddu/BANKING77 | 2023-10-10T11:49:10.000Z | [
"license:unknown",
"region:us"
] | hyeonddu | null | null | null | 0 | 0 | ---
license: unknown
---
|
loubnabnl/old_python | 2023-10-10T11:53:50.000Z | [
"region:us"
] | loubnabnl | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: __id__
dtype: int64
- name: blob_id
dtype: string
- name: directory_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_fork
dtype: bool
- name: gha_event_created_at
dtype: timestamp[ns]
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_size
dtype: int64
- name: gha_stargazers_count
dtype: int32
- name: gha_forks_count
dtype: int32
- name: gha_open_issues_count
dtype: int32
- name: gha_language
dtype: string
- name: gha_archived
dtype: bool
- name: gha_disabled
dtype: bool
- name: content
dtype: string
- name: src_encoding
dtype: string
- name: language
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: year
dtype: int64
splits:
- name: train
num_bytes: 205861897.66555908
num_examples: 42509
download_size: 91464746
dataset_size: 205861897.66555908
---
# Dataset Card for "old_python"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ramzey/processed_bert_dataset | 2023-10-10T13:02:03.000Z | [
"region:us"
] | Ramzey | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 576000.0
num_examples: 160
download_size: 0
dataset_size: 576000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
advancedcv/isiafoodcap_2 | 2023-10-11T01:20:22.000Z | [
"region:us"
] | advancedcv | null | null | null | 0 | 0 | Entry not found |
alimranfakir/yolov8 | 2023-10-10T11:56:23.000Z | [
"license:mit",
"region:us"
] | alimranfakir | null | null | null | 0 | 0 | ---
license: mit
---
|
datastax/philosophers-quotes | 2023-10-10T13:42:13.000Z | [
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"code",
"region:us"
] | datastax | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- conversational
language:
- en
tags:
- code
pretty_name: Philosophers Quote
size_categories:
- 1K<n<10K
--- |
mesude/turkishReviews-mini | 2023-10-10T12:02:30.000Z | [
"region:us"
] | mesude | null | null | null | 0 | 0 | Entry not found |
jondurbin/mathjson-alpha | 2023-10-10T12:15:58.000Z | [
"license:apache-2.0",
"region:us"
] | jondurbin | null | null | null | 2 | 0 | ---
license: apache-2.0
datasets:
- gsm8k
- meta-math/MetaMathQA
---
This is a first pass at generating MathJSON formulations of math problems to allow deterministic calculations (via cortex-js/compute-engine).
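For a flavor of the idea, here is a minimal Python sketch of deterministic MathJSON evaluation. The operator subset is an assumption chosen for illustration; the intended runtime for this dataset is cortex-js/compute-engine, not this toy evaluator.
```python
import math

# Tiny illustrative MathJSON evaluator (not the real compute-engine runtime).
OPS = {
    "Add": lambda args: sum(args),
    "Subtract": lambda args: args[0] - args[1],
    "Multiply": lambda args: math.prod(args),
    "Divide": lambda args: args[0] / args[1],
    "Power": lambda args: args[0] ** args[1],
    "Cos": lambda args: math.cos(args[0]),
}

def evaluate(expr):
    """Lists are function calls ["Head", arg1, ...]; bare numbers are literals."""
    if isinstance(expr, list):
        head, *args = expr
        return OPS[head]([evaluate(arg) for arg in args])
    return expr

# cos(1 radian) * 2 + 3: the LLM emits the formulation, Python computes it.
print(evaluate(["Add", ["Multiply", ["Cos", 1], 2], 3]))  # ~= 4.0806
```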
LLMs are decent at problem formulation but terrible at calculations, especially things like computing the cosine of R radians or high-precision floating-point multiplication. Let's let LLMs do what they are good at and run the computation outside. |
open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking | 2023-10-10T12:12:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xDAN-AI/xDAN-L1-Thinking
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xDAN-AI/xDAN-L1-Thinking](https://huggingface.co/xDAN-AI/xDAN-L1-Thinking) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T12:10:41.690417](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking/blob/main/results_2023-10-10T12-10-41.690417.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6289386722445444,\n\
\ \"acc_stderr\": 0.033198865884709695,\n \"acc_norm\": 0.6328416238634791,\n\
\ \"acc_norm_stderr\": 0.03317472328982102,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5212953226899916,\n\
\ \"mc2_stderr\": 0.015376349056492798\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.014346869060229321,\n\
\ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6577375024895439,\n\
\ \"acc_stderr\": 0.0047349726682996175,\n \"acc_norm\": 0.8453495319657439,\n\
\ \"acc_norm_stderr\": 0.0036083220651418903\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334388,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334388\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031093,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031093\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431378,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431378\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.01414397027665757,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.01414397027665757\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.015414494487903219,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.015414494487903219\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.01916241858862356,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.01916241858862356\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.01686294168408838,\n \"mc2\": 0.5212953226899916,\n\
\ \"mc2_stderr\": 0.015376349056492798\n }\n}\n```"
repo_url: https://huggingface.co/xDAN-AI/xDAN-L1-Thinking
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-10-41.690417.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-10-41.690417.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-10-41.690417.parquet'
- config_name: results
data_files:
- split: 2023_10_10T12_10_41.690417
path:
- results_2023-10-10T12-10-41.690417.parquet
- split: latest
path:
- results_2023-10-10T12-10-41.690417.parquet
---
# Dataset Card for Evaluation run of xDAN-AI/xDAN-L1-Thinking
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xDAN-AI/xDAN-L1-Thinking
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xDAN-AI/xDAN-L1-Thinking](https://huggingface.co/xDAN-AI/xDAN-L1-Thinking) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-10T12:10:41.690417](https://huggingface.co/datasets/open-llm-leaderboard/details_xDAN-AI__xDAN-L1-Thinking/blob/main/results_2023-10-10T12-10-41.690417.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6289386722445444,
"acc_stderr": 0.033198865884709695,
"acc_norm": 0.6328416238634791,
"acc_norm_stderr": 0.03317472328982102,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.01686294168408838,
"mc2": 0.5212953226899916,
"mc2_stderr": 0.015376349056492798
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.014346869060229321,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955012
},
"harness|hellaswag|10": {
"acc": 0.6577375024895439,
"acc_stderr": 0.0047349726682996175,
"acc_norm": 0.8453495319657439,
"acc_norm_stderr": 0.0036083220651418903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334388,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334388
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031093,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031093
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431378,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.01414397027665757,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.01414397027665757
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903219,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903219
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.01916241858862356,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.01916241858862356
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.01686294168408838,
"mc2": 0.5212953226899916,
"mc2_stderr": 0.015376349056492798
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
amphora/lmsys-finance | 2023-10-10T12:25:26.000Z | [
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"finance",
"region:us"
] | amphora | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: model
dtype: string
- name: conversation
dtype: string
- name: turn
dtype: int64
- name: language
dtype: string
- name: openai_moderation
dtype: string
- name: redacted
dtype: bool
- name: count
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 10328855
num_examples: 735
download_size: 3912614
dataset_size: 10328855
task_categories:
- conversational
language:
- en
tags:
- finance
size_categories:
- n<1K
---
# Dataset Card for "lmsys-finance"
This dataset is a curated version of the [lmsys-chat-1m](https://huggingface.co/datasets/lmsys/lmsys-chat-1m) dataset,
focusing solely on finance-related conversations. The refinement process encompassed:
1. Removing non-English conversations.
2. Selecting conversations from models: "vicuna-33b", "wizardlm-13b", "gpt-4", "gpt-3.5-turbo", "claude-2", "palm-2", and "claude-instant-1".
3. Excluding conversations with responses under 30 characters.
4. Using 100 financial keywords, choosing conversations with at least 10 keywords. |
loubnabnl/new_py | 2023-10-10T12:16:18.000Z | [
"region:us"
] | loubnabnl | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: __id__
dtype: int64
- name: blob_id
dtype: string
- name: directory_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_fork
dtype: bool
- name: gha_event_created_at
dtype: timestamp[ns]
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_size
dtype: int64
- name: gha_stargazers_count
dtype: int32
- name: gha_forks_count
dtype: int32
- name: gha_open_issues_count
dtype: int32
- name: gha_language
dtype: string
- name: gha_archived
dtype: bool
- name: gha_disabled
dtype: bool
- name: content
dtype: string
- name: src_encoding
dtype: string
- name: language
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: year
dtype: int64
splits:
- name: train
num_bytes: 4842783.826144089
num_examples: 1000
download_size: 1638066
dataset_size: 4842783.826144089
---
# Dataset Card for "new_py"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loubnabnl/old_py | 2023-10-10T12:16:21.000Z | [
"region:us"
] | loubnabnl | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: __id__
dtype: int64
- name: blob_id
dtype: string
- name: directory_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_fork
dtype: bool
- name: gha_event_created_at
dtype: timestamp[ns]
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_size
dtype: int64
- name: gha_stargazers_count
dtype: int32
- name: gha_forks_count
dtype: int32
- name: gha_open_issues_count
dtype: int32
- name: gha_language
dtype: string
- name: gha_archived
dtype: bool
- name: gha_disabled
dtype: bool
- name: content
dtype: string
- name: src_encoding
dtype: string
- name: language
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: year
dtype: int64
splits:
- name: train
num_bytes: 4842783.826144089
num_examples: 1000
download_size: 2031848
dataset_size: 4842783.826144089
---
# Dataset Card for "old_py"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atulsinghphd/e2r-finetune-data1 | 2023-10-10T12:18:08.000Z | [
"region:us"
] | atulsinghphd | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 120837
num_examples: 430
download_size: 26024
dataset_size: 120837
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e2r-finetune-data1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
berardi6/LBcmopcenscaspnewwsx1 | 2023-10-10T12:18:38.000Z | [
"region:us"
] | berardi6 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 507066
num_examples: 1734
download_size: 177947
dataset_size: 507066
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LBcmopcenscaspnewwsx1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Omar1010/maraton | 2023-10-10T12:28:25.000Z | [
"region:us"
] | Omar1010 | null | null | null | 0 | 0 | Entry not found |
joheras/spanish-suicide-intent | 2023-10-10T14:20:03.000Z | [
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:es",
"license:cc-by-4.0",
"region:us"
] | joheras | null | null | null | 0 | 0 | ---
license: cc-by-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Text
dtype: string
- name: Label
dtype: int64
- name: dataset
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 31442785
num_examples: 136136
- name: val
num_bytes: 3542897
num_examples: 15131
- name: test
num_bytes: 8671755
num_examples: 37820
download_size: 17952583
dataset_size: 43657437
task_categories:
- text-classification
language:
- es
size_categories:
- 100K<n<1M
---
# Dataset Summary
The dataset consists of comments from several sources, translated into Spanish and classified as suicidal ideation/behavior or non-suicidal.
# Dataset Structure
The dataset has 175010 rows (77223 labeled as suicidal ideation/behavior and 97787 labeled as not suicidal).
## Dataset fields
* `Text`: User comment.
* `Label`: 1 if suicidal ideation/behavior; 0 if not suicidal comment.
* `dataset`: Source of the comment
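The splits and fields above can be loaded directly with the `datasets` library; a minimal sketch:
```python
from datasets import load_dataset

ds = load_dataset("joheras/spanish-suicide-intent")

# Splits defined for this dataset: train, val and test.
train, val, test = ds["train"], ds["val"], ds["test"]

# Each row carries the user comment, its binary label (1 = suicidal
# ideation/behavior, 0 = not suicidal) and its source corpus.
example = train[0]
print(example["Text"], example["Label"], example["dataset"])
```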
# Dataset Creation
* 112385 (84485 non suicidal, 27905 suicidal) from the [Suicide Watch dataset](https://www.kaggle.com/datasets/nikhileswarkomati/suicide-watch/).
* 46894 (46894 suicidal) from the [TwitterSuicidalAnalysis](https://github.com/IE-NITK/TwitterSuicidalAnalysis).
* 9919 (9183 non suicidal, 736 suicidal) from the corpus generated in [Hackathon Somos NLP](https://huggingface.co/datasets/hackathon-somos-nlp-2023/suicide-comments-es)
* 8744 (4802 non suicidal, 3942 suicidal) from the paper [An Attention-based hybrid architecture with explainability for depressive social media text detection in Bangla](https://github.com/NM001007/An-Attention-based-Hybrid-Suicide-Ideation-Detection)
* 7084 (3559 non suicidal, 3525 suicidal) from the paper [Supervised Learning for Suicidal Ideation Detection in Online User Content](https://github.com/TabbieD/NLP-Sentiment-Analysis)
* 1972 (1540 non suicidal, 432 suicidal) from the paper [Detection of Suicidal Intent in Spanish Language Social Networks using Machine Learning](https://github.com/kvvaldez/spanish_suicide/blob/master/dataset/suicidio_notacion.csv)
* 1769 (1122 non suicidal, 647 suicidal) from the corpus [Suicidal Tweet Detection](https://www.kaggle.com/datasets/aunanya875/suicidal-tweet-detection-dataset/data)
* 316 (204 non suicidal, 112 suicidal) from the paper [Data Mining Approach to the Detection of Suicide in Social Media: A Case Study of Singapore](https://github.com/shingkid/data-mining-suicide-sg/tree/master)
# Considerations for Using the Data
## Social Impact of Dataset
The dataset may capture patterns that help detect suicidal ideation/behavior.
## Discussion of Biases
No measures have been taken to estimate the bias and toxicity embedded in the dataset. However, most of the data was collected from Reddit, Twitter, and ChatGPT, so there is probably an age bias, since [the Internet is used more by younger people](https://www.statista.com/statistics/272365/age-distribution-of-internet-users-worldwide).
# Additional Information
## Team
* [joheras](https://huggingface.co/joheras)
|
Star-gazer/WIX1002 | 2023-10-10T13:41:21.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | Star-gazer | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
These are the datasets for the WIX1002 2023 assignment.
Credit to: data.gov.my |
sleepyboyeyes/Caroline | 2023-10-10T20:01:26.000Z | [
"region:us"
] | sleepyboyeyes | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Yukang__LongAlpaca-7B | 2023-10-10T12:49:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/LongAlpaca-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/LongAlpaca-7B](https://huggingface.co/Yukang/LongAlpaca-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__LongAlpaca-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T12:48:17.445800](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-7B/blob/main/results_2023-10-10T12-48-17.445800.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2787552777179305,\n\
\ \"acc_stderr\": 0.032247803087369235,\n \"acc_norm\": 0.28195137283193583,\n\
\ \"acc_norm_stderr\": 0.03224572341092157,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253592,\n \"mc2\": 0.40157343839532983,\n\
\ \"mc2_stderr\": 0.015144086156771541\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.40017064846416384,\n \"acc_stderr\": 0.01431719778780919,\n\
\ \"acc_norm\": 0.42662116040955633,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49681338378809003,\n\
\ \"acc_stderr\": 0.004989680072717476,\n \"acc_norm\": 0.6589324835690101,\n\
\ \"acc_norm_stderr\": 0.004730991357194313\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.35094339622641507,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.35094339622641507,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165085,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165085\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594295,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.02694748312149623,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.02694748312149623\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918428,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918428\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1349206349206349,\n\
\ \"acc_stderr\": 0.030557101589417515,\n \"acc_norm\": 0.1349206349206349,\n\
\ \"acc_norm_stderr\": 0.030557101589417515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2870967741935484,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.2870967741935484,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173355,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173355\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2828282828282828,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.2828282828282828,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2923076923076923,\n \"acc_stderr\": 0.023060438380857744,\n\
\ \"acc_norm\": 0.2923076923076923,\n \"acc_norm_stderr\": 0.023060438380857744\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275788,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275788\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31092436974789917,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.31092436974789917,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3192660550458716,\n\
\ \"acc_stderr\": 0.019987829069750003,\n \"acc_norm\": 0.3192660550458716,\n\
\ \"acc_norm_stderr\": 0.019987829069750003\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.03324708911809117,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.03324708911809117\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.031980016601150726,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.031980016601150726\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.31223628691983124,\n \"acc_stderr\": 0.03016513786784701,\n \
\ \"acc_norm\": 0.31223628691983124,\n \"acc_norm_stderr\": 0.03016513786784701\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.042258754519696386,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.042258754519696386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n\
\ \"acc_stderr\": 0.03635209121577806,\n \"acc_norm\": 0.17857142857142858,\n\
\ \"acc_norm_stderr\": 0.03635209121577806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.046202840822800406,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.046202840822800406\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.31196581196581197,\n\
\ \"acc_stderr\": 0.030351527323344958,\n \"acc_norm\": 0.31196581196581197,\n\
\ \"acc_norm_stderr\": 0.030351527323344958\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n\
\ \"acc_stderr\": 0.016225017944770978,\n \"acc_norm\": 0.28991060025542786,\n\
\ \"acc_norm_stderr\": 0.016225017944770978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321617,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321617\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729494,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729494\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\
\ \"acc_stderr\": 0.024926723224845543,\n \"acc_norm\": 0.2604501607717042,\n\
\ \"acc_norm_stderr\": 0.024926723224845543\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266736,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266736\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n\
\ \"acc_stderr\": 0.011044892264040772,\n \"acc_norm\": 0.24902216427640156,\n\
\ \"acc_norm_stderr\": 0.011044892264040772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280058,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280058\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22712418300653595,\n \"acc_stderr\": 0.016949853279212373,\n \
\ \"acc_norm\": 0.22712418300653595,\n \"acc_norm_stderr\": 0.016949853279212373\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.32653061224489793,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.32653061224489793,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253592,\n \"mc2\": 0.40157343839532983,\n\
\ \"mc2_stderr\": 0.015144086156771541\n }\n}\n```"
repo_url: https://huggingface.co/Yukang/LongAlpaca-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-48-17.445800.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-48-17.445800.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-48-17.445800.parquet'
- config_name: results
data_files:
- split: 2023_10_10T12_48_17.445800
path:
- results_2023-10-10T12-48-17.445800.parquet
- split: latest
path:
- results_2023-10-10T12-48-17.445800.parquet
---
# Dataset Card for Evaluation run of Yukang/LongAlpaca-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/LongAlpaca-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/LongAlpaca-7B](https://huggingface.co/Yukang/LongAlpaca-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__LongAlpaca-7B",
"harness_truthfulqa_mc_0",
split="train")
```
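Similarly, the aggregated numbers alone can be read from the "results" configuration, using the "latest" split defined in this card:
```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_Yukang__LongAlpaca-7B",
                       "results",
                       split="latest")
```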
## Latest results
These are the [latest results from run 2023-10-10T12:48:17.445800](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__LongAlpaca-7B/blob/main/results_2023-10-10T12-48-17.445800.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2787552777179305,
"acc_stderr": 0.032247803087369235,
"acc_norm": 0.28195137283193583,
"acc_norm_stderr": 0.03224572341092157,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253592,
"mc2": 0.40157343839532983,
"mc2_stderr": 0.015144086156771541
},
"harness|arc:challenge|25": {
"acc": 0.40017064846416384,
"acc_stderr": 0.01431719778780919,
"acc_norm": 0.42662116040955633,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.49681338378809003,
"acc_stderr": 0.004989680072717476,
"acc_norm": 0.6589324835690101,
"acc_norm_stderr": 0.004730991357194313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.35094339622641507,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.35094339622641507,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165085,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165085
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594295,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.02694748312149623,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.02694748312149623
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918428,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918428
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1349206349206349,
"acc_stderr": 0.030557101589417515,
"acc_norm": 0.1349206349206349,
"acc_norm_stderr": 0.030557101589417515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173355,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173355
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.034588160421810045,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.034588160421810045
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2923076923076923,
"acc_stderr": 0.023060438380857744,
"acc_norm": 0.2923076923076923,
"acc_norm_stderr": 0.023060438380857744
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275788,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275788
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31092436974789917,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.31092436974789917,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3192660550458716,
"acc_stderr": 0.019987829069750003,
"acc_norm": 0.3192660550458716,
"acc_norm_stderr": 0.019987829069750003
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.03324708911809117,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.03324708911809117
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.031980016601150726,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.031980016601150726
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31223628691983124,
"acc_stderr": 0.03016513786784701,
"acc_norm": 0.31223628691983124,
"acc_norm_stderr": 0.03016513786784701
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.042258754519696386,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.042258754519696386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.03635209121577806,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.03635209121577806
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.046202840822800406,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.046202840822800406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.31196581196581197,
"acc_stderr": 0.030351527323344958,
"acc_norm": 0.31196581196581197,
"acc_norm_stderr": 0.030351527323344958
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770978,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321617,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321617
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729494,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729494
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2604501607717042,
"acc_stderr": 0.024926723224845543,
"acc_norm": 0.2604501607717042,
"acc_norm_stderr": 0.024926723224845543
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266736,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266736
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.011044892264040772,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.011044892264040772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280058,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280058
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22712418300653595,
"acc_stderr": 0.016949853279212373,
"acc_norm": 0.22712418300653595,
"acc_norm_stderr": 0.016949853279212373
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.32653061224489793,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.32653061224489793,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253592,
"mc2": 0.40157343839532983,
"mc2_stderr": 0.015144086156771541
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NbAiLab/nbnn_language_detection | 2023-10-10T13:49:49.000Z | [
"region:us"
] | NbAiLab | null | null | null | 0 | 0 | # Dataset Card for Bokmål-Nynorsk Language Detection (main_train_split)
## Dataset Summary
This dataset is intended for language detection between Bokmål and Nynorsk. It contains 800,000 sentence pairs, sourced from Språkbanken and pruned to avoid overlap with the NorBench dataset. The data comes from translations of news text from Norsk telegrambyrå (NTB), performed by Nynorsk pressekontor (NPK). In addition, the dev and test sets contain 1,000 entries.
## Data Collection
- **Period**: February 2011 to December 2022
- **Source**: [Språkbanken](https://www.nb.no/sprakbanken/)
- **Size**: 800,000 sentence pairs
- **Format**: JSON-lines (with `text`, `label`, and `label_full` fields)
### Processing Steps
1. Pruned to avoid overlap with NorBench
2. Deduplicated
3. Shuffled with a fixed seed (42); see the sketch below
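A minimal sketch of loading the data and reproducing the fixed-seed shuffle from step 3, assuming the `datasets` library and a `train` split (both are assumptions; the actual preparation script lives in the repository):
```python
from datasets import load_dataset

# Each JSON-lines record carries the sentence ("text"), a class
# label ("label") and its human-readable name ("label_full").
ds = load_dataset("NbAiLab/nbnn_language_detection", split="train")  # split name assumed

# Reproduce the fixed-seed shuffle described in step 3.
ds = ds.shuffle(seed=42)
print(ds[0]["text"], ds[0]["label_full"])
```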
## Usage
Intended for training Bokmål-Nynorsk detection models. For more details, refer to the repository where the dataset preparation script and the actual dataset reside. |
MoaazId/cityscape_Fine | 2023-10-10T13:07:28.000Z | [
"region:us"
] | MoaazId | null | null | null | 0 | 0 | Entry not found |
ostapeno/wiki_platypus_inverse_mmlu_icl5_cleaned_1_iter | 2023-10-10T12:56:18.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: subject
dtype: string
- name: response
dtype: string
- name: author_instr
dtype: string
- name: inst_index_for_context
dtype: 'null'
- name: author_response
dtype: string
- name: normalized_cumul_logprob_response
dtype: float64
splits:
- name: formal_logic
num_bytes: 17985531.444360264
num_examples: 4093
- name: machine_learning
num_bytes: 23935301.680290826
num_examples: 5447
- name: global_facts
num_bytes: 22775228.310803626
num_examples: 5183
- name: abstract_algebra
num_bytes: 17893252.880878326
num_examples: 4072
- name: high_school_physics
num_bytes: 35377843.55205093
num_examples: 8051
- name: college_biology
num_bytes: 32323862.522529706
num_examples: 7356
- name: high_school_government_and_politics
num_bytes: 41389132.83030279
num_examples: 9419
- name: prehistory
num_bytes: 54246612.67545259
num_examples: 12345
- name: security_studies
num_bytes: 44834199.200295076
num_examples: 10203
- name: sociology
num_bytes: 39833579.903035864
num_examples: 9065
download_size: 103829505
dataset_size: 330594545.0
---
# Dataset Card for "wiki_platypus_inverse_mmlu_icl5_cleaned_1_iter"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_undi95__llama2-to-mistral-diff | 2023-10-10T12:57:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of undi95/llama2-to-mistral-diff
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [undi95/llama2-to-mistral-diff](https://huggingface.co/undi95/llama2-to-mistral-diff)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_undi95__llama2-to-mistral-diff\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T12:55:48.397880](https://huggingface.co/datasets/open-llm-leaderboard/details_undi95__llama2-to-mistral-diff/blob/main/results_2023-10-10T12-55-48.397880.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4668758865288765,\n\
\ \"acc_stderr\": 0.03526795867551185,\n \"acc_norm\": 0.47089073297233314,\n\
\ \"acc_norm_stderr\": 0.03525358948223725,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.38714596689664715,\n\
\ \"mc2_stderr\": 0.013504367947573348\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255793,\n\
\ \"acc_norm\": 0.5341296928327645,\n \"acc_norm_stderr\": 0.014577311315231102\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5888269269069907,\n\
\ \"acc_stderr\": 0.004910409150135491,\n \"acc_norm\": 0.7856004779924318,\n\
\ \"acc_norm_stderr\": 0.004095663731959219\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509566,\n\
\ \"acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509566\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.038254602783800246,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.038254602783800246\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.0355944356556392,\n \"acc_norm\"\
: 0.4797979797979798,\n \"acc_norm_stderr\": 0.0355944356556392\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.03308818594415751,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.03308818594415751\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6293577981651376,\n \"acc_stderr\": 0.02070745816435298,\n \"\
acc_norm\": 0.6293577981651376,\n \"acc_norm_stderr\": 0.02070745816435298\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012393,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012393\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"\
acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811225,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811225\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.03023638994217309,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.03023638994217309\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6411238825031929,\n\
\ \"acc_stderr\": 0.017152991797501342,\n \"acc_norm\": 0.6411238825031929,\n\
\ \"acc_norm_stderr\": 0.017152991797501342\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02782074420373286,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02782074420373286\n },\n\
\ \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n\
\ \"acc_stderr\": 0.028538650028878638,\n \"acc_norm\": 0.3546099290780142,\n\
\ \"acc_norm_stderr\": 0.028538650028878638\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.37222946544980445,\n \"acc_stderr\": 0.012346241297204368,\n\
\ \"acc_norm\": 0.37222946544980445,\n \"acc_norm_stderr\": 0.012346241297204368\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"\
acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.38714596689664715,\n\
\ \"mc2_stderr\": 0.013504367947573348\n }\n}\n```"
repo_url: https://huggingface.co/undi95/llama2-to-mistral-diff
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T12-55-48.397880.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-55-48.397880.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T12-55-48.397880.parquet'
- config_name: results
data_files:
- split: 2023_10_10T12_55_48.397880
path:
- results_2023-10-10T12-55-48.397880.parquet
- split: latest
path:
- results_2023-10-10T12-55-48.397880.parquet
---
# Dataset Card for Evaluation run of undi95/llama2-to-mistral-diff
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/undi95/llama2-to-mistral-diff
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [undi95/llama2-to-mistral-diff](https://huggingface.co/undi95/llama2-to-mistral-diff) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_undi95__llama2-to-mistral-diff",
"harness_truthfulqa_mc_0",
split="train")
```
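For instance, to pull the per-example details for a single sub-task rather than the full run, you can target one of the per-task configurations listed in this card's `configs` section and its `latest` split (the config name below, `harness_hendrycksTest_world_religions_5`, is one of them):

```python
from datasets import load_dataset

# Each evaluated task has its own configuration; the "latest" split
# resolves to the most recent parquet file for that task.
details = load_dataset(
    "open-llm-leaderboard/details_undi95__llama2-to-mistral-diff",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details[0])  # one row of per-example evaluation details
```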
## Latest results
These are the [latest results from run 2023-10-10T12:55:48.397880](https://huggingface.co/datasets/open-llm-leaderboard/details_undi95__llama2-to-mistral-diff/blob/main/results_2023-10-10T12-55-48.397880.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4668758865288765,
"acc_stderr": 0.03526795867551185,
"acc_norm": 0.47089073297233314,
"acc_norm_stderr": 0.03525358948223725,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.38714596689664715,
"mc2_stderr": 0.013504367947573348
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255793,
"acc_norm": 0.5341296928327645,
"acc_norm_stderr": 0.014577311315231102
},
"harness|hellaswag|10": {
"acc": 0.5888269269069907,
"acc_stderr": 0.004910409150135491,
"acc_norm": 0.7856004779924318,
"acc_norm_stderr": 0.004095663731959219
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509566,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.038254602783800246,
"acc_norm": 0.6,
"acc_norm_stderr": 0.038254602783800246
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.0355944356556392,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.0355944356556392
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.03308818594415751,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.03308818594415751
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6293577981651376,
"acc_stderr": 0.02070745816435298,
"acc_norm": 0.6293577981651376,
"acc_norm_stderr": 0.02070745816435298
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012393,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012393
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811225,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811225
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.03023638994217309,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.03023638994217309
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6411238825031929,
"acc_stderr": 0.017152991797501342,
"acc_norm": 0.6411238825031929,
"acc_norm_stderr": 0.017152991797501342
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5,
"acc_stderr": 0.02782074420373286,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02782074420373286
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878638,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37222946544980445,
"acc_stderr": 0.012346241297204368,
"acc_norm": 0.37222946544980445,
"acc_norm_stderr": 0.012346241297204368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.38714596689664715,
"mc2_stderr": 0.013504367947573348
}
}
```
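If you want to sanity-check the headline numbers, here is a minimal sketch. It assumes the `"all"` block is the unweighted mean of the per-task metrics (our reading of the output above, not a documented guarantee of the harness) and that the results JSON linked above may nest the metrics under a `"results"` key:

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file referenced in this card's "results" config.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_undi95__llama2-to-mistral-diff",
    filename="results_2023-10-10T12-55-48.397880.json",
    repo_type="dataset",
)
data = json.load(open(path))
results = data.get("results", data)  # fall back if metrics are top-level

# Unweighted mean over tasks reporting "acc" (truthfulqa only has mc1/mc2).
accs = [m["acc"] for task, m in results.items() if task != "all" and "acc" in m]
print(sum(accs) / len(accs))  # expected to land near acc = 0.4669
```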
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype | 2023-10-10T13:01:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of The-Face-Of-Goonery/Huginn-19b-prototype
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [The-Face-Of-Goonery/Huginn-19b-prototype](https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T13:00:00.797867](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype/blob/main/results_2023-10-10T13-00-00.797867.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each task in the \"results\" and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5582906573142512,\n\
\ \"acc_stderr\": 0.03432283652142382,\n \"acc_norm\": 0.5621813333085338,\n\
\ \"acc_norm_stderr\": 0.034304199441954245,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.41150494590178466,\n\
\ \"mc2_stderr\": 0.01462306768928086\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526843,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449703\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6199960167297351,\n\
\ \"acc_stderr\": 0.004843954338451441,\n \"acc_norm\": 0.8102967536347341,\n\
\ \"acc_norm_stderr\": 0.003912649521823133\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6451612903225806,\n \"acc_stderr\": 0.027218889773308764,\n \"\
acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.027218889773308764\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.028979089794296732,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.028979089794296732\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.02533900301010651,\n \
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.02533900301010651\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.03228410626716389,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.03228410626716389\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501954,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.02845882099146031,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.02845882099146031\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.024904439098918228,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.024904439098918228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.01513338327898883,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.01513338327898883\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3418994413407821,\n\
\ \"acc_stderr\": 0.01586450646160466,\n \"acc_norm\": 0.3418994413407821,\n\
\ \"acc_norm_stderr\": 0.01586450646160466\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159617,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159617\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971646,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971646\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.02677492989972233,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.02677492989972233\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573086,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573086\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40547588005215124,\n\
\ \"acc_stderr\": 0.012539960672377202,\n \"acc_norm\": 0.40547588005215124,\n\
\ \"acc_norm_stderr\": 0.012539960672377202\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030802,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030802\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.41150494590178466,\n\
\ \"mc2_stderr\": 0.01462306768928086\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-00-00.797867.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-00-00.797867.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-00-00.797867.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_00_00.797867
path:
- results_2023-10-10T13-00-00.797867.parquet
- split: latest
path:
- results_2023-10-10T13-00-00.797867.parquet
---
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-19b-prototype
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-19b-prototype](https://huggingface.co/The-Face-Of-Goonery/Huginn-19b-prototype) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype",
"harness_truthfulqa_mc_0",
split="train")
```
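The aggregated scores are available through the dedicated "results" configuration, and the "latest" split of any configuration always points to the most recent run, so they can be loaded the same way, for instance:
```python
from datasets import load_dataset

# Aggregated scores of the most recent run; the per-task configs accept the same split.
results = load_dataset(
    "open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype",
    "results",
    split="latest",
)
```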
## Latest results
These are the [latest results from run 2023-10-10T13:00:00.797867](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-19b-prototype/blob/main/results_2023-10-10T13-00-00.797867.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5582906573142512,
"acc_stderr": 0.03432283652142382,
"acc_norm": 0.5621813333085338,
"acc_norm_stderr": 0.034304199441954245,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.41150494590178466,
"mc2_stderr": 0.01462306768928086
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526843,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449703
},
"harness|hellaswag|10": {
"acc": 0.6199960167297351,
"acc_stderr": 0.004843954338451441,
"acc_norm": 0.8102967536347341,
"acc_norm_stderr": 0.003912649521823133
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308764,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308764
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.028979089794296732,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.028979089794296732
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.02533900301010651,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.02533900301010651
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.03228410626716389,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.03228410626716389
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.02845882099146031,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.02845882099146031
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918228,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.01513338327898883,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.01513338327898883
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3418994413407821,
"acc_stderr": 0.01586450646160466,
"acc_norm": 0.3418994413407821,
"acc_norm_stderr": 0.01586450646160466
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159617,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159617
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971646,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971646
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.02677492989972233,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.02677492989972233
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573086,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573086
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40547588005215124,
"acc_stderr": 0.012539960672377202,
"acc_norm": 0.40547588005215124,
"acc_norm_stderr": 0.012539960672377202
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030802,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030802
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.41150494590178466,
"mc2_stderr": 0.01462306768928086
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jjzha/imdb-dutch-instruct | 2023-10-10T13:03:55.000Z | [
"size_categories:10K<n<100K",
"language:nl",
"license:apache-2.0",
"region:us"
] | jjzha | null | null | null | 0 | 0 | ---
language:
- nl
license:
- apache-2.0
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_examples: 24992
- name: test
num_examples: 24992
---
# Dataset Card for "imdb-dutch-instruct"
## Dataset Description
The original IMDB dataset was translated to Dutch with [yhavinga/ul2-large-en-nl](https://huggingface.co/yhavinga/ul2-large-en-nl).
Then, the dataset is converted to an instruct-style dataset with the following templates.
The instruction templates:
- "Is deze recensie positief of negatief?"
- "Wat is het sentiment van de recensie?"
- "Wat voor toon heeft de volgende recensie?"
- "Met wat voor sentiment zou je deze recensie beoordelen?"
The target templates:
- "De recensie is"
- "Gegeven de recensie, mijn antwoord is"
- "Deze recensie is"
- "De beoordeling hier is"
- "Het antwoord is"
### Dataset Summary
Large Movie Review Dataset translated to Dutch and converted to instruct style.
This is a dataset for sentiment classification containing substantially more data than previous benchmark datasets.
### Languages and Example
This dataset contains Dutch data.
An example of 'train' looks as follows.
```
{
"inputs": "Is deze recensie positief of negatief?\n\nIk heb alle vier de films in deze serie gezien. Elke film wijkt steeds verder af van de boeken. Deze is de ergste tot nu toe. Mijn probleem is dat hij op geen enkele manier het boek volgt waar hij naar genoemd is! De regisseurs en producenten hadden hem een andere naam moeten geven dan 'Love's Abiding Joy'. Het enige aan deze film dat ook maar in de verte op het boek lijkt, zijn de namen van sommige personages (Willie, Missie, Henry, Clark, Scottie en Cookie). De namen/ouders/verzorgers van de kinderen kloppen niet. De hele verhaallijn staat nergens in het boek. '<br />Ik vind het een grote belediging voor Janette Oke, haar boeken en haar fans om een film onder haar titel te produceren die in geen enkel opzicht correct is. De muziek is te hard. De acteurs zijn niet overtuigend <0xE2><0x80><0x93> ze missen emoties.<br />Als je een goede familiefilm wilt, is dit misschien goed. Het is schoon. Maar kijk er niet naar, als je hoopt op een verkorte versie van het boek. Ik hoop dat dit de laatste film uit deze serie zal zijn, maar ik betwijfel het. Als er meer films worden gemaakt, zou ik willen dat Michael Landon jr. en anderen dichter bij de oorspronkelijke plot en verhaallijn zouden blijven. De boeken zijn uitstekend en als je ze goed leest, zijn het uitstekende films!",
"targets": "Het antwoord is negatief."}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `inputs`: a `string` feature, starting with a question asking whether the review is positive or negative.
- `targets`: a `string` feature, with a template prefix and the final label.
### Data Splits
| name |train|test |
|----------|----:|----:|
|plain_text|24992|24992|
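The dataset can then be loaded by its repository id in the usual way, for instance:
```python
from datasets import load_dataset

# Both splits contain 24,992 examples (see the table above).
dataset = load_dataset("jjzha/imdb-dutch-instruct")
print(dataset["train"][0]["inputs"])
print(dataset["train"][0]["targets"])
```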
### Official Citation Information
The original data is from here: https://huggingface.co/datasets/yhavinga/imdb_dutch
```
@InProceedings{maas-EtAl:2011:ACL-HLT2011,
author = {Maas, Andrew L. and Daly, Raymond E. and Pham, Peter T. and Huang, Dan and Ng, Andrew Y. and Potts, Christopher},
title = {Learning Word Vectors for Sentiment Analysis},
booktitle = {Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies},
month = {June},
year = {2011},
address = {Portland, Oregon, USA},
publisher = {Association for Computational Linguistics},
pages = {142--150},
url = {http://www.aclweb.org/anthology/P11-1015}
}
```
Created by [Mike Zhang](https://jjzha.github.io/)
|
open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft | 2023-10-10T13:04:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/Llama-2-7b-longlora-32k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-7b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T13:03:11.726005](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft/blob/main/results_2023-10-10T13-03-11.726005.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23095342099711472,\n\
\ \"acc_stderr\": 0.03068798097484314,\n \"acc_norm\": 0.23205178409700405,\n\
\ \"acc_norm_stderr\": 0.03070714755069479,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4957068563437588,\n\
\ \"mc2_stderr\": 0.016914945574930968\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n\
\ \"acc_norm\": 0.2790102389078498,\n \"acc_norm_stderr\": 0.013106784883601341\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2570205138418642,\n\
\ \"acc_stderr\": 0.004360977256058745,\n \"acc_norm\": 0.2561242780322645,\n\
\ \"acc_norm_stderr\": 0.004355992090030995\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.19704433497536947,\n \"acc_stderr\": 0.02798672466673621,\n \"\
acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.02798672466673621\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n\
\ \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n\
\ \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456418,\n\
\ \"mc2\": 0.4957068563437588,\n \"mc2_stderr\": 0.016914945574930968\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-03-11.726005.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-03-11.726005.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-03-11.726005.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_03_11.726005
path:
- results_2023-10-10T13-03-11.726005.parquet
- split: latest
path:
- results_2023-10-10T13-03-11.726005.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-7b-longlora-32k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-7b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft",
"harness_truthfulqa_mc_0",
split="train")
```
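The aggregated metrics for the run can be loaded the same way through the "results" configuration defined in the YAML above; a minimal sketch, assuming the "latest" split holds a single row mirroring the JSON shown under "Latest results" below:
```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration; its "latest" split always
# points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft",
    "results",
    split="latest",
)
print(results[0])  # aggregated scores for the run (assumed single-row layout)
```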
## Latest results
These are the [latest results from run 2023-10-10T13:03:11.726005](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft/blob/main/results_2023-10-10T13-03-11.726005.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23095342099711472,
"acc_stderr": 0.03068798097484314,
"acc_norm": 0.23205178409700405,
"acc_norm_stderr": 0.03070714755069479,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4957068563437588,
"mc2_stderr": 0.016914945574930968
},
"harness|arc:challenge|25": {
"acc": 0.21331058020477817,
"acc_stderr": 0.011970971742326334,
"acc_norm": 0.2790102389078498,
"acc_norm_stderr": 0.013106784883601341
},
"harness|hellaswag|10": {
"acc": 0.2570205138418642,
"acc_stderr": 0.004360977256058745,
"acc_norm": 0.2561242780322645,
"acc_norm_stderr": 0.004355992090030995
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.02798672466673621,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.02798672466673621
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4957068563437588,
"mc2_stderr": 0.016914945574930968
}
}
```
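If you only need the raw JSON linked above, it can also be fetched directly from the dataset repository; a minimal sketch using `huggingface_hub` (the filename comes from the link above, and the exact nesting of the JSON is an assumption handled defensively):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-32k-ft",
    filename="results_2023-10-10T13-03-11.726005.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The per-task metrics may sit at the top level (as printed above) or under a
# "results" key depending on the harness version, so handle both.
metrics = data.get("results", data)
print(metrics["harness|truthfulqa:mc|0"]["mc2"])
```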
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Envoid__Libra-19B | 2023-10-10T13:08:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Envoid/Libra-19B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Envoid/Libra-19B](https://huggingface.co/Envoid/Libra-19B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Envoid__Libra-19B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T13:06:44.906506](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Libra-19B/blob/main/results_2023-10-10T13-06-44.906506.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5569293553820306,\n\
\ \"acc_stderr\": 0.034449414411067046,\n \"acc_norm\": 0.5610225705423245,\n\
\ \"acc_norm_stderr\": 0.034428764182003746,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.4841022865255407,\n\
\ \"mc2_stderr\": 0.014988563695820389\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558898,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6181039633539136,\n\
\ \"acc_stderr\": 0.00484858324360668,\n \"acc_norm\": 0.8203545110535749,\n\
\ \"acc_norm_stderr\": 0.0038310732859630774\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.02679556084812281,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.02679556084812281\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.034223985656575494,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.034223985656575494\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.036277305750224094,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.036277305750224094\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.032894773300986155,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.032894773300986155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803644,\n\
\ \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.029252823291803644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.03238546948758979,\n \
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.03238546948758979\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7229357798165138,\n \"acc_stderr\": 0.01918848259016953,\n \"\
acc_norm\": 0.7229357798165138,\n \"acc_norm_stderr\": 0.01918848259016953\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648371,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648371\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719683,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719683\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306397,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306397\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.01642167050633919,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.01642167050633919\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631462,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631462\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704732,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704732\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5702614379084967,\n \"acc_stderr\": 0.020027122784928544,\n \
\ \"acc_norm\": 0.5702614379084967,\n \"acc_norm_stderr\": 0.020027122784928544\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.4841022865255407,\n\
\ \"mc2_stderr\": 0.014988563695820389\n }\n}\n```"
repo_url: https://huggingface.co/Envoid/Libra-19B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-06-44.906506.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-06-44.906506.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-06-44.906506.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_06_44.906506
path:
- results_2023-10-10T13-06-44.906506.parquet
- split: latest
path:
- results_2023-10-10T13-06-44.906506.parquet
---
# Dataset Card for Evaluation run of Envoid/Libra-19B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Envoid/Libra-19B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Envoid/Libra-19B](https://huggingface.co/Envoid/Libra-19B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Envoid__Libra-19B",
"harness_truthfulqa_mc_0",
split="train")
```
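The aggregated scores for the run are also exposed through the `results` configuration defined in the YAML above; as a minimal sketch (same API, different config name), loading its `latest` split should return the aggregated metrics reproduced in the next section:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Envoid__Libra-19B",
	"results",
	split="latest")
```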
## Latest results
These are the [latest results from run 2023-10-10T13:06:44.906506](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Libra-19B/blob/main/results_2023-10-10T13-06-44.906506.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5569293553820306,
"acc_stderr": 0.034449414411067046,
"acc_norm": 0.5610225705423245,
"acc_norm_stderr": 0.034428764182003746,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.4841022865255407,
"mc2_stderr": 0.014988563695820389
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558898,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.6181039633539136,
"acc_stderr": 0.00484858324360668,
"acc_norm": 0.8203545110535749,
"acc_norm_stderr": 0.0038310732859630774
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899208,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.02679556084812281,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.02679556084812281
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.034223985656575494,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.034223985656575494
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.036277305750224094,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.036277305750224094
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.032894773300986155,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.032894773300986155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803644,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.03238546948758979,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.03238546948758979
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7229357798165138,
"acc_stderr": 0.01918848259016953,
"acc_norm": 0.7229357798165138,
"acc_norm_stderr": 0.01918848259016953
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648371,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719683,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719683
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306397,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306397
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.01642167050633919,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.01642167050633919
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631462,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704732,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704732
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5702614379084967,
"acc_stderr": 0.020027122784928544,
"acc_norm": 0.5702614379084967,
"acc_norm_stderr": 0.020027122784928544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.4841022865255407,
"mc2_stderr": 0.014988563695820389
}
}
```
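To collapse the per-task dictionary above into a single MMLU number, a few lines of Python suffice. This is only a sketch: it assumes the results JSON linked above has been downloaded locally as `results.json` (hypothetical filename) and has the same layout as the dict shown above:
```python
import json

# Hypothetical local copy of the results file linked above.
with open("results.json") as f:
    results = json.load(f)

# Macro-average acc_norm over the hendrycksTest (MMLU) tasks,
# skipping the "all" aggregate and the ARC/HellaSwag/TruthfulQA entries.
mmlu = [m["acc_norm"] for task, m in results.items()
        if task.startswith("harness|hendrycksTest")]
print(f"MMLU acc_norm over {len(mmlu)} tasks: {sum(mmlu) / len(mmlu):.4f}")
```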
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft | 2023-10-10T13:10:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/Llama-2-7b-longlora-16k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-7b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T13:08:49.738155](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft/blob/main/results_2023-10-10T13-08-49.738155.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2371847317799568,\n\
\ \"acc_stderr\": 0.030901112237362246,\n \"acc_norm\": 0.23836292100308282,\n\
\ \"acc_norm_stderr\": 0.03092162608694053,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.01453786760130114,\n \"mc2\": 0.4775777270955148,\n\
\ \"mc2_stderr\": 0.016623850534886544\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20051194539249148,\n \"acc_stderr\": 0.011700318050499377,\n\
\ \"acc_norm\": 0.2636518771331058,\n \"acc_norm_stderr\": 0.01287592915129706\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25731925911173076,\n\
\ \"acc_stderr\": 0.004362633637374482,\n \"acc_norm\": 0.2636924915355507,\n\
\ \"acc_norm_stderr\": 0.004397339661695462\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.03455473702325436,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.03455473702325436\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246797,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246797\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n\
\ \"acc_stderr\": 0.029771775228145638,\n \"acc_norm\": 0.23529411764705882,\n\
\ \"acc_norm_stderr\": 0.029771775228145638\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.031024411740572203,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.031024411740572203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.015162024152278441,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.015162024152278441\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.02352924218519311,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.02352924218519311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22857142857142856,\n\
\ \"acc_stderr\": 0.026882144922307748,\n \"acc_norm\": 0.22857142857142856,\n\
\ \"acc_norm_stderr\": 0.026882144922307748\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2215422276621787,\n \"mc1_stderr\": 0.01453786760130114,\n\
\ \"mc2\": 0.4775777270955148,\n \"mc2_stderr\": 0.016623850534886544\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-08-49.738155.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-08-49.738155.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-08-49.738155.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_08_49.738155
path:
- results_2023-10-10T13-08-49.738155.parquet
- split: latest
path:
- results_2023-10-10T13-08-49.738155.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-7b-longlora-16k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-7b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft",
"harness_truthfulqa_mc_0",
split="train")
```
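Since the repository exposes one configuration per task, you can also enumerate them programmatically rather than reading them off this card. A minimal sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names

# List all 61 task configurations available in this details repository
configs = get_dataset_config_names("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft")
print(len(configs), configs[:5])
```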
## Latest results
These are the [latest results from run 2023-10-10T13:08:49.738155](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft/blob/main/results_2023-10-10T13-08-49.738155.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2371847317799568,
"acc_stderr": 0.030901112237362246,
"acc_norm": 0.23836292100308282,
"acc_norm_stderr": 0.03092162608694053,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.01453786760130114,
"mc2": 0.4775777270955148,
"mc2_stderr": 0.016623850534886544
},
"harness|arc:challenge|25": {
"acc": 0.20051194539249148,
"acc_stderr": 0.011700318050499377,
"acc_norm": 0.2636518771331058,
"acc_norm_stderr": 0.01287592915129706
},
"harness|hellaswag|10": {
"acc": 0.25731925911173076,
"acc_stderr": 0.004362633637374482,
"acc_norm": 0.2636924915355507,
"acc_norm_stderr": 0.004397339661695462
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325436,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325436
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246797,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572203,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278441,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278441
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.02352924218519311,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.02352924218519311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307748,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.01453786760130114,
"mc2": 0.4775777270955148,
"mc2_stderr": 0.016623850534886544
}
}
```
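To work with these aggregated numbers directly instead of copying them from the card, you can load the `results` configuration defined in the metadata above; a minimal sketch, assuming the `latest` split alias resolves as configured:
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-7b-longlora-16k-ft",
                       "results",
                       split="latest")
print(results[0])
```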
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
laion/wuerstchen-dataset | 2023-10-10T21:09:01.000Z | [
"region:us"
] | laion | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
- name: link
dtype: string
- name: message_id
dtype: string
- name: timestamp
dtype: string
splits:
- name: train
num_bytes: 1106567952.0
num_examples: 879
download_size: 1105539069
dataset_size: 1106567952.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wuerstchen-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Envoid__Yousei-22B | 2023-10-10T13:11:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Envoid/Yousei-22B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Envoid/Yousei-22B](https://huggingface.co/Envoid/Yousei-22B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Envoid__Yousei-22B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T13:09:41.852615](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Yousei-22B/blob/main/results_2023-10-10T13-09-41.852615.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5239286386600809,\n\
\ \"acc_stderr\": 0.034881101408338694,\n \"acc_norm\": 0.5281493624790682,\n\
\ \"acc_norm_stderr\": 0.03486547766870961,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.5067888730518536,\n\
\ \"mc2_stderr\": 0.015949761865278096\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5127986348122867,\n \"acc_stderr\": 0.014606603181012538,\n\
\ \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5825532762397929,\n\
\ \"acc_stderr\": 0.004921300331285574,\n \"acc_norm\": 0.7855008962358097,\n\
\ \"acc_norm_stderr\": 0.004096355125117513\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.03024223380085449,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.03024223380085449\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.02721888977330877,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.02721888977330877\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.037131580674819135,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.037131580674819135\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448663,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448663\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5336134453781513,\n \"acc_stderr\": 0.03240501447690071,\n \
\ \"acc_norm\": 0.5336134453781513,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415182,\n \"\
acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415182\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"\
acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.0435644720266507,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.0435644720266507\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7203065134099617,\n\
\ \"acc_stderr\": 0.016050792148036536,\n \"acc_norm\": 0.7203065134099617,\n\
\ \"acc_norm_stderr\": 0.016050792148036536\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5809248554913294,\n \"acc_stderr\": 0.02656417811142262,\n\
\ \"acc_norm\": 0.5809248554913294,\n \"acc_norm_stderr\": 0.02656417811142262\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n\
\ \"acc_stderr\": 0.016165847583563292,\n \"acc_norm\": 0.37206703910614525,\n\
\ \"acc_norm_stderr\": 0.016165847583563292\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176647,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176647\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.02773125864701199,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.02773125864701199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722327,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0286638201471995,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0286638201471995\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4074315514993481,\n\
\ \"acc_stderr\": 0.012549473714212226,\n \"acc_norm\": 0.4074315514993481,\n\
\ \"acc_norm_stderr\": 0.012549473714212226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5016339869281046,\n \"acc_stderr\": 0.020227726838150117,\n \
\ \"acc_norm\": 0.5016339869281046,\n \"acc_norm_stderr\": 0.020227726838150117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154188,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154188\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\
\ \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n\
\ \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.5067888730518536,\n\
\ \"mc2_stderr\": 0.015949761865278096\n }\n}\n```"
repo_url: https://huggingface.co/Envoid/Yousei-22B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-09-41.852615.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-09-41.852615.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-09-41.852615.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_09_41.852615
path:
- results_2023-10-10T13-09-41.852615.parquet
- split: latest
path:
- results_2023-10-10T13-09-41.852615.parquet
---
# Dataset Card for Evaluation run of Envoid/Yousei-22B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Envoid/Yousei-22B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Envoid/Yousei-22B](https://huggingface.co/Envoid/Yousei-22B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Envoid__Yousei-22B",
"harness_truthfulqa_mc_0",
split="train")
```
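If you want the aggregated metrics rather than the per-task details, you can point `load_dataset` at the `results` configuration instead. This is a minimal sketch, assuming the timestamped/`latest` split layout defined in the configs above:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split
# resolves to the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Envoid__Yousei-22B",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics of one run as stored in the
# results parquet file.
print(results[0])
```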
## Latest results
These are the [latest results from run 2023-10-10T13:09:41.852615](https://huggingface.co/datasets/open-llm-leaderboard/details_Envoid__Yousei-22B/blob/main/results_2023-10-10T13-09-41.852615.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5239286386600809,
"acc_stderr": 0.034881101408338694,
"acc_norm": 0.5281493624790682,
"acc_norm_stderr": 0.03486547766870961,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.5067888730518536,
"mc2_stderr": 0.015949761865278096
},
"harness|arc:challenge|25": {
"acc": 0.5127986348122867,
"acc_stderr": 0.014606603181012538,
"acc_norm": 0.5588737201365188,
"acc_norm_stderr": 0.014509747749064664
},
"harness|hellaswag|10": {
"acc": 0.5825532762397929,
"acc_stderr": 0.004921300331285574,
"acc_norm": 0.7855008962358097,
"acc_norm_stderr": 0.004096355125117513
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.03024223380085449,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.03024223380085449
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.037131580674819135,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.037131580674819135
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6464646464646465,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.6464646464646465,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448663,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448663
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5336134453781513,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.5336134453781513,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415182,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415182
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.0435644720266507,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.0435644720266507
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7203065134099617,
"acc_stderr": 0.016050792148036536,
"acc_norm": 0.7203065134099617,
"acc_norm_stderr": 0.016050792148036536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5809248554913294,
"acc_stderr": 0.02656417811142262,
"acc_norm": 0.5809248554913294,
"acc_norm_stderr": 0.02656417811142262
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.016165847583563292,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.016165847583563292
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176647,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176647
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.02773125864701199,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.02773125864701199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722327,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0286638201471995,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0286638201471995
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4074315514993481,
"acc_stderr": 0.012549473714212226,
"acc_norm": 0.4074315514993481,
"acc_norm_stderr": 0.012549473714212226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5016339869281046,
"acc_stderr": 0.020227726838150117,
"acc_norm": 0.5016339869281046,
"acc_norm_stderr": 0.020227726838150117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154188,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154188
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.5067888730518536,
"mc2_stderr": 0.015949761865278096
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
twdent/HikingHD | 2023-10-10T14:05:24.000Z | [
"region:us"
] | twdent | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 515009990.0
num_examples: 38
download_size: 159208907
dataset_size: 515009990.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "HikingHD"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamdanXI/daily_dialog_text_to_gloss_final | 2023-10-10T13:31:14.000Z | [
"region:us"
] | HamdanXI | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: gloss
dtype: string
splits:
- name: train
num_bytes: 6048869
num_examples: 75415
download_size: 3960195
dataset_size: 6048869
---
# Dataset Card for "daily_dialog_text_to_gloss_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft | 2023-10-10T13:27:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/Llama-2-13b-longlora-32k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-13b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T13:26:13.835261](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft/blob/main/results_2023-10-10T13-26-13.835261.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5235522320689743,\n\
\ \"acc_stderr\": 0.035038668557256,\n \"acc_norm\": 0.527753200254576,\n\
\ \"acc_norm_stderr\": 0.03501756688265017,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.37438135497816827,\n\
\ \"mc2_stderr\": 0.013759468601775139\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097664,\n\
\ \"acc_norm\": 0.5947098976109215,\n \"acc_norm_stderr\": 0.014346869060229321\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6217884883489345,\n\
\ \"acc_stderr\": 0.004839497020536613,\n \"acc_norm\": 0.8261302529376618,\n\
\ \"acc_norm_stderr\": 0.003782228743661059\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5471698113207547,\n \"acc_stderr\": 0.030635627957961823,\n\
\ \"acc_norm\": 0.5471698113207547,\n \"acc_norm_stderr\": 0.030635627957961823\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.49710982658959535,\n\
\ \"acc_stderr\": 0.03812400565974833,\n \"acc_norm\": 0.49710982658959535,\n\
\ \"acc_norm_stderr\": 0.03812400565974833\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028435,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028435\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347364,\n \"\
acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347364\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036544,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036544\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165633,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165633\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.033711241426263014,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.033711241426263014\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n\
\ \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.03257714077709662\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954935,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954935\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340503,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340503\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5336134453781513,\n \"acc_stderr\": 0.03240501447690071,\n \
\ \"acc_norm\": 0.5336134453781513,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6752293577981652,\n \"acc_stderr\": 0.020077729109310327,\n \"\
acc_norm\": 0.6752293577981652,\n \"acc_norm_stderr\": 0.020077729109310327\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.032702871814820816,\n \"\
acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.032702871814820816\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199985,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199985\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7564102564102564,\n\
\ \"acc_stderr\": 0.02812096650391441,\n \"acc_norm\": 0.7564102564102564,\n\
\ \"acc_norm_stderr\": 0.02812096650391441\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7049808429118773,\n\
\ \"acc_stderr\": 0.016308363772932728,\n \"acc_norm\": 0.7049808429118773,\n\
\ \"acc_norm_stderr\": 0.016308363772932728\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.026424816594009845,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.026424816594009845\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27932960893854747,\n\
\ \"acc_stderr\": 0.015005762446786164,\n \"acc_norm\": 0.27932960893854747,\n\
\ \"acc_norm_stderr\": 0.015005762446786164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510467998,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510467998\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401273,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401273\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40221642764015647,\n\
\ \"acc_stderr\": 0.012523646856180178,\n \"acc_norm\": 0.40221642764015647,\n\
\ \"acc_norm_stderr\": 0.012523646856180178\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492534,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492534\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n\
\ \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n\
\ \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.033773102522092056,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.033773102522092056\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.37438135497816827,\n\
\ \"mc2_stderr\": 0.013759468601775139\n }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-26-13.835261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-26-13.835261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-26-13.835261.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_26_13.835261
path:
- results_2023-10-10T13-26-13.835261.parquet
- split: latest
path:
- results_2023-10-10T13-26-13.835261.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-13b-longlora-32k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-longlora-32k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft",
"harness_truthfulqa_mc_0",
split="train")
```
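The same pattern works for the aggregated metrics. A minimal sketch (assuming the `datasets` library and the configuration names listed above; not an official snippet from this card's generator):
```python
from datasets import load_dataset
# The "results" configuration holds the aggregated metrics; its "latest" split
# always points to the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated metrics of the run
```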
## Latest results
These are the [latest results from run 2023-10-10T13:26:13.835261](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-32k-ft/blob/main/results_2023-10-10T13-26-13.835261.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5235522320689743,
"acc_stderr": 0.035038668557256,
"acc_norm": 0.527753200254576,
"acc_norm_stderr": 0.03501756688265017,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.37438135497816827,
"mc2_stderr": 0.013759468601775139
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097664,
"acc_norm": 0.5947098976109215,
"acc_norm_stderr": 0.014346869060229321
},
"harness|hellaswag|10": {
"acc": 0.6217884883489345,
"acc_stderr": 0.004839497020536613,
"acc_norm": 0.8261302529376618,
"acc_norm_stderr": 0.003782228743661059
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5471698113207547,
"acc_stderr": 0.030635627957961823,
"acc_norm": 0.5471698113207547,
"acc_norm_stderr": 0.030635627957961823
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.03812400565974833,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.03812400565974833
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028435,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028435
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347364,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347364
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036544,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036544
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165633,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165633
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.033711241426263014,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.033711241426263014
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.025334667080954935,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.025334667080954935
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340503,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340503
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5336134453781513,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.5336134453781513,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6752293577981652,
"acc_stderr": 0.020077729109310327,
"acc_norm": 0.6752293577981652,
"acc_norm_stderr": 0.020077729109310327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.032702871814820816,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.032702871814820816
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199985,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199985
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7564102564102564,
"acc_stderr": 0.02812096650391441,
"acc_norm": 0.7564102564102564,
"acc_norm_stderr": 0.02812096650391441
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7049808429118773,
"acc_stderr": 0.016308363772932728,
"acc_norm": 0.7049808429118773,
"acc_norm_stderr": 0.016308363772932728
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.026424816594009845,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.026424816594009845
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27932960893854747,
"acc_stderr": 0.015005762446786164,
"acc_norm": 0.27932960893854747,
"acc_norm_stderr": 0.015005762446786164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510467998,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510467998
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401273,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138016,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40221642764015647,
"acc_stderr": 0.012523646856180178,
"acc_norm": 0.40221642764015647,
"acc_norm_stderr": 0.012523646856180178
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492534,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492534
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.033773102522092056,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.033773102522092056
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.37438135497816827,
"mc2_stderr": 0.013759468601775139
}
}
```
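Since every `harness|hendrycksTest-*` entry shares the same metric names, the per-subtask accuracies are easy to tabulate. A minimal sketch (assuming `pandas` is installed and the dictionary above is bound to a variable `results`):
```python
import pandas as pd
# `results` is assumed to hold the dictionary shown above.
rows = [
    {"task": task, "acc": metrics["acc"]}
    for task, metrics in results.items()
    if "hendrycksTest" in task  # keep only the MMLU subtasks
]
df = pd.DataFrame(rows).sort_values("acc", ascending=False)
print(df.head(10))  # the ten strongest subtasks in this run
```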
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft | 2023-10-10T13:34:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Yukang/Llama-2-13b-longlora-16k-ft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yukang/Llama-2-13b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T13:32:51.379088](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft/blob/main/results_2023-10-10T13-32-51.379088.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2309514868511923,\n\
\ \"acc_stderr\": 0.03069747797345808,\n \"acc_norm\": 0.23225844916267432,\n\
\ \"acc_norm_stderr\": 0.030718445749202464,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871088,\n \"mc2\": 0.48893302050171134,\n\
\ \"mc2_stderr\": 0.016667601604993615\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19795221843003413,\n \"acc_stderr\": 0.011643990971573391,\n\
\ \"acc_norm\": 0.25853242320819114,\n \"acc_norm_stderr\": 0.012794553754288679\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25951005775741887,\n\
\ \"acc_stderr\": 0.004374699189284862,\n \"acc_norm\": 0.27604062935670187,\n\
\ \"acc_norm_stderr\": 0.004461235175488313\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0227797190887334,\n\
\ \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0227797190887334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n\
\ \"acc_stderr\": 0.025206963154225395,\n \"acc_norm\": 0.19183673469387755,\n\
\ \"acc_norm_stderr\": 0.025206963154225395\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871088,\n\
\ \"mc2\": 0.48893302050171134,\n \"mc2_stderr\": 0.016667601604993615\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T13-32-51.379088.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-32-51.379088.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T13-32-51.379088.parquet'
- config_name: results
data_files:
- split: 2023_10_10T13_32_51.379088
path:
- results_2023-10-10T13-32-51.379088.parquet
- split: latest
path:
- results_2023-10-10T13-32-51.379088.parquet
---
# Dataset Card for Evaluation run of Yukang/Llama-2-13b-longlora-16k-ft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yukang/Llama-2-13b-longlora-16k-ft](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the details of one task ("harness_truthfulqa_mc_0") for this model;
# the "latest" split aliases the most recent evaluation run (see the YAML header).
data = load_dataset("open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft",
	"harness_truthfulqa_mc_0",
	split="latest")
```
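For a quick sanity check of what was loaded, only the generic `datasets` API is needed; this sketch makes no assumptions about the column schema of the details files:
```python
# Inspect the split returned by load_dataset above.
print(data)               # number of rows and column names
print(data.column_names)  # schema of this details split

# Optional: move to pandas for ad-hoc analysis of the per-example details.
df = data.to_pandas()
print(df.head())
```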
## Latest results
These are the [latest results from run 2023-10-10T13:32:51.379088](https://huggingface.co/datasets/open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft/blob/main/results_2023-10-10T13-32-51.379088.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2309514868511923,
"acc_stderr": 0.03069747797345808,
"acc_norm": 0.23225844916267432,
"acc_norm_stderr": 0.030718445749202464,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871088,
"mc2": 0.48893302050171134,
"mc2_stderr": 0.016667601604993615
},
"harness|arc:challenge|25": {
"acc": 0.19795221843003413,
"acc_stderr": 0.011643990971573391,
"acc_norm": 0.25853242320819114,
"acc_norm_stderr": 0.012794553754288679
},
"harness|hellaswag|10": {
"acc": 0.25951005775741887,
"acc_stderr": 0.004374699189284862,
"acc_norm": 0.27604062935670187,
"acc_norm_stderr": 0.004461235175488313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871088,
"mc2": 0.48893302050171134,
"mc2_stderr": 0.016667601604993615
}
}
```
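To work with these aggregated numbers programmatically, the `results` configuration declared in the YAML header can be loaded directly; this is a minimal sketch, assuming the `latest` split alias defined there:
```python
from datasets import load_dataset

# Load the parquet file behind the "results" configuration; "latest"
# aliases the most recent run, mirroring the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_Yukang__Llama-2-13b-longlora-16k-ft",
    "results",
    split="latest",
)
print(results[0])  # typically a single row holding the aggregated metrics
```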
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SADATO/furniture-keytoad | 2023-10-10T13:51:58.000Z | [
"region:us"
] | SADATO | null | null | null | 0 | 0 | Entry not found |
Abira1/finance | 2023-10-10T14:20:12.000Z | [
"region:us"
] | Abira1 | null | null | null | 0 | 0 | Entry not found |
amin-nejad/EuroSat | 2023-10-10T13:45:16.000Z | [
"region:us"
] | amin-nejad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AnnualCrop
'1': Forest
'2': HerbaceousVegetation
'3': Highway
'4': Industrial
'5': Pasture
'6': PermanentCrop
'7': Residential
'8': River
'9': SeaLake
splits:
- name: train
num_bytes: 83171379.6
num_examples: 24300
download_size: 82782583
dataset_size: 83171379.6
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "EuroSat"
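The YAML header above declares an `image` feature and a ten-class `class_label`; as a minimal sketch (assuming only that declared schema), the integer labels can be decoded back to their names like this:
```python
from datasets import load_dataset

# Load the train split declared above and decode an integer label
# back to its class name via the ClassLabel feature.
ds = load_dataset("amin-nejad/EuroSat", split="train")
label_feature = ds.features["label"]            # ClassLabel with 10 names
print(label_feature.int2str(ds[0]["label"]))    # e.g. "AnnualCrop"
```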
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamdanXI/gloss_to_text | 2023-10-10T13:59:03.000Z | [
"region:us"
] | HamdanXI | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: gloss
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 18221089
num_examples: 145572
download_size: 10879299
dataset_size: 18221089
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gloss_to_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rpii2023/lallalala | 2023-10-10T13:52:32.000Z | [
"region:us"
] | rpii2023 | null | null | null | 0 | 0 | Entry not found |
FlippNipper/freddouglas | 2023-10-10T14:03:22.000Z | [
"region:us"
] | FlippNipper | null | null | null | 0 | 0 | Entry not found |
Lancelot53/bengali_ai_ipa | 2023-10-10T14:07:56.000Z | [
"region:us"
] | Lancelot53 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: ipa
dtype: string
- name: row_id_column_name
dtype: int64
splits:
- name: train
num_bytes: 6974634
num_examples: 21999
- name: test
num_bytes: 5861099
num_examples: 27228
download_size: 6174391
dataset_size: 12835733
---
# Dataset Card for "bengali_ai_ipa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Coroseven/MarinKitagawa | 2023-10-10T14:02:01.000Z | [
"region:us"
] | Coroseven | null | null | null | 0 | 0 | Entry not found |
nalmeida/test_local1 | 2023-10-10T14:03:19.000Z | [
"region:us"
] | nalmeida | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf | 2023-10-10T14:03:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Community-LM/llava-v1.5-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Community-LM/llava-v1.5-13b-hf](https://huggingface.co/Community-LM/llava-v1.5-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:01:34.065508](https://huggingface.co/datasets/open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf/blob/main/results_2023-10-10T14-01-34.065508.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5687974861474466,\n\
\ \"acc_stderr\": 0.034102420636387375,\n \"acc_norm\": 0.5727205361494934,\n\
\ \"acc_norm_stderr\": 0.034085436281331656,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.433460825483405,\n\
\ \"mc2_stderr\": 0.01517244922847158\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.01458063756999542,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6011750647281418,\n\
\ \"acc_stderr\": 0.004886559008754983,\n \"acc_norm\": 0.8036247759410476,\n\
\ \"acc_norm_stderr\": 0.003964437012249994\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.0241804971643769,\n \"acc_norm\"\
: 0.328042328042328,\n \"acc_norm_stderr\": 0.0241804971643769\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438803,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438803\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397433,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.018368176306598618,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.018368176306598618\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6319018404907976,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.6319018404907976,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335835,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335835\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\
\ \"acc_stderr\": 0.015652542496421114,\n \"acc_norm\": 0.3240223463687151,\n\
\ \"acc_norm_stderr\": 0.015652542496421114\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424523,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424523\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868045,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41590612777053454,\n\
\ \"acc_stderr\": 0.012588323850313608,\n \"acc_norm\": 0.41590612777053454,\n\
\ \"acc_norm_stderr\": 0.012588323850313608\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5784313725490197,\n \"acc_stderr\": 0.019977422600227477,\n \
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.019977422600227477\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.433460825483405,\n\
\ \"mc2_stderr\": 0.01517244922847158\n }\n}\n```"
repo_url: https://huggingface.co/Community-LM/llava-v1.5-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-01-34.065508.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-01-34.065508.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-01-34.065508.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_01_34.065508
path:
- results_2023-10-10T14-01-34.065508.parquet
- split: latest
path:
- results_2023-10-10T14-01-34.065508.parquet
---
# Dataset Card for Evaluation run of Community-LM/llava-v1.5-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Community-LM/llava-v1.5-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Community-LM/llava-v1.5-13b-hf](https://huggingface.co/Community-LM/llava-v1.5-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf",
"harness_truthfulqa_mc_0",
split="train")
```
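Any of the per-task configurations listed in the YAML header above can be loaded the same way. As a minimal sketch (the config names and the "latest" split are taken from the YAML header of this card):
```python
from datasets import load_dataset

# Latest run for a single MMLU subtask; config names are listed in the
# YAML header above, e.g. "harness_hendrycksTest_abstract_algebra_5".
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf",
    "results",
    split="latest",
)
```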
## Latest results
These are the [latest results from run 2023-10-10T14:01:34.065508](https://huggingface.co/datasets/open-llm-leaderboard/details_Community-LM__llava-v1.5-13b-hf/blob/main/results_2023-10-10T14-01-34.065508.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5687974861474466,
"acc_stderr": 0.034102420636387375,
"acc_norm": 0.5727205361494934,
"acc_norm_stderr": 0.034085436281331656,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.433460825483405,
"mc2_stderr": 0.01517244922847158
},
"harness|arc:challenge|25": {
"acc": 0.5324232081911263,
"acc_stderr": 0.01458063756999542,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212864
},
"harness|hellaswag|10": {
"acc": 0.6011750647281418,
"acc_stderr": 0.004886559008754983,
"acc_norm": 0.8036247759410476,
"acc_norm_stderr": 0.003964437012249994
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.0241804971643769,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.0241804971643769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438803,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438803
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397433,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240644,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240644
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6319018404907976,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.6319018404907976,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748928,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748928
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335835,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335835
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.015652542496421114,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.015652542496421114
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424523,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424523
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868045,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41590612777053454,
"acc_stderr": 0.012588323850313608,
"acc_norm": 0.41590612777053454,
"acc_norm_stderr": 0.012588323850313608
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.019977422600227477,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.019977422600227477
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.433460825483405,
"mc2_stderr": 0.01517244922847158
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nalmeida/securitas | 2023-10-10T14:06:04.000Z | [
"region:us"
] | nalmeida | null | null | null | 0 | 0 | Entry not found |
Coroseven/YotsubaNakano | 2023-10-10T14:10:30.000Z | [
"region:us"
] | Coroseven | null | null | null | 0 | 0 | Entry not found |
Waterfront/social-media-captions | 2023-10-10T18:29:58.000Z | [
"region:us"
] | Waterfront | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private | 2023-10-10T14:11:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of HuggingFaceH4/zephyr-7b-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HuggingFaceH4/zephyr-7b-alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:11:13.991325](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private/blob/main/results_2023-10-10T14-11-13.991325.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6137978230566867,\n\
\ \"acc_stderr\": 0.03380754595328641,\n \"acc_norm\": 0.6176702382672306,\n\
\ \"acc_norm_stderr\": 0.03378555360789072,\n \"mc1\": 0.42717258261933905,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5790339154881958,\n\
\ \"mc2_stderr\": 0.015362629183533977\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5810580204778157,\n \"acc_stderr\": 0.014418106953639011,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.01425295984889289\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6409081856203943,\n\
\ \"acc_stderr\": 0.004787537385153006,\n \"acc_norm\": 0.8403704441346346,\n\
\ \"acc_norm_stderr\": 0.0036551361115537096\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.02501074911613761,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.02501074911613761\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.02447224384089553,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.02447224384089553\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460285,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460285\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\
\ \"acc_stderr\": 0.014551310568143704,\n \"acc_norm\": 0.7905491698595147,\n\
\ \"acc_norm_stderr\": 0.014551310568143704\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n\
\ \"acc_stderr\": 0.01615591072134177,\n \"acc_norm\": 0.37094972067039106,\n\
\ \"acc_norm_stderr\": 0.01615591072134177\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41003911342894395,\n\
\ \"acc_stderr\": 0.012561837621962044,\n \"acc_norm\": 0.41003911342894395,\n\
\ \"acc_norm_stderr\": 0.012561837621962044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.5790339154881958,\n\
\ \"mc2_stderr\": 0.015362629183533977\n }\n}\n```"
repo_url: https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-13.991325.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-13.991325.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-13.991325.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_11_13.991325
path:
- results_2023-10-10T14-11-13.991325.parquet
- split: latest
path:
- results_2023-10-10T14-11-13.991325.parquet
---
# Dataset Card for Evaluation run of HuggingFaceH4/zephyr-7b-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HuggingFaceH4/zephyr-7b-alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private",
"harness_truthfulqa_mc_0",
split="train")
```
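For example, to pull the latest results of a single MMLU sub-task, you can target one of the per-task configurations defined in the YAML header above (a minimal sketch; the config name and the "latest" split alias follow the patterns listed there):
```python
from datasets import load_dataset

# Load the "latest" split of the 5-shot abstract-algebra sub-task.
data = load_dataset(
    "open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```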
## Latest results
These are the [latest results from run 2023-10-10T14:11:13.991325](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private/blob/main/results_2023-10-10T14-11-13.991325.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6137978230566867,
"acc_stderr": 0.03380754595328641,
"acc_norm": 0.6176702382672306,
"acc_norm_stderr": 0.03378555360789072,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.5790339154881958,
"mc2_stderr": 0.015362629183533977
},
"harness|arc:challenge|25": {
"acc": 0.5810580204778157,
"acc_stderr": 0.014418106953639011,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.01425295984889289
},
"harness|hellaswag|10": {
"acc": 0.6409081856203943,
"acc_stderr": 0.004787537385153006,
"acc_norm": 0.8403704441346346,
"acc_norm_stderr": 0.0036551361115537096
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.02501074911613761,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.02501074911613761
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217902,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217902
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460285,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460285
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143704,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143704
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.01615591072134177,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.01615591072134177
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41003911342894395,
"acc_stderr": 0.012561837621962044,
"acc_norm": 0.41003911342894395,
"acc_norm_stderr": 0.012561837621962044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.5790339154881958,
"mc2_stderr": 0.015362629183533977
}
}
```
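The aggregated metrics above are also stored in the "results" configuration, so they can be loaded directly instead of parsing this JSON (again a sketch, using the "latest" split alias defined in the YAML header):
```python
from datasets import load_dataset

# Load the aggregated results for the latest run as a dataset.
results = load_dataset(
    "open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-alpha_private",
    "results",
    split="latest",
)
```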
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0 | 2023-10-10T14:13:25.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-tora-code-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-tora-code-7b-v1.0](https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:11:59.032357](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0/blob/main/results_2023-10-10T14-11-59.032357.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3873136911648318,\n\
\ \"acc_stderr\": 0.03488491861505594,\n \"acc_norm\": 0.39082023400364535,\n\
\ \"acc_norm_stderr\": 0.034885243377185654,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4205675471010907,\n\
\ \"mc2_stderr\": 0.014623112128590065\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3822525597269625,\n \"acc_stderr\": 0.014200454049979284,\n\
\ \"acc_norm\": 0.42662116040955633,\n \"acc_norm_stderr\": 0.014453185592920293\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48904600677155946,\n\
\ \"acc_stderr\": 0.0049885838203099185,\n \"acc_norm\": 0.6515634335789683,\n\
\ \"acc_norm_stderr\": 0.0047550132430221265\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.38113207547169814,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.38113207547169814,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.040329990539607175,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.040329990539607175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.03078373675774565,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.03078373675774565\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633342,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633342\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.38387096774193546,\n\
\ \"acc_stderr\": 0.02766618207553964,\n \"acc_norm\": 0.38387096774193546,\n\
\ \"acc_norm_stderr\": 0.02766618207553964\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.03903698647748441,\n\
\ \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.03903698647748441\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.48484848484848486,\n \"acc_stderr\": 0.0356071651653106,\n \"\
acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.0356071651653106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47150259067357514,\n \"acc_stderr\": 0.03602573571288441,\n\
\ \"acc_norm\": 0.47150259067357514,\n \"acc_norm_stderr\": 0.03602573571288441\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.0273091405882302,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.0273091405882302\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.03169380235712997,\n \
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.03169380235712997\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45688073394495415,\n \"acc_stderr\": 0.021357458785226213,\n \"\
acc_norm\": 0.45688073394495415,\n \"acc_norm_stderr\": 0.021357458785226213\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353602,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353602\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5147679324894515,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.5147679324894515,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009223,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009223\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5454545454545454,\n \"acc_stderr\": 0.04545454545454548,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04545454545454548\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6239316239316239,\n\
\ \"acc_stderr\": 0.03173393632969482,\n \"acc_norm\": 0.6239316239316239,\n\
\ \"acc_norm_stderr\": 0.03173393632969482\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4891443167305236,\n\
\ \"acc_stderr\": 0.017875748840242407,\n \"acc_norm\": 0.4891443167305236,\n\
\ \"acc_norm_stderr\": 0.017875748840242407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.026424816594009852,\n\
\ \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.026424816594009852\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364557,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364557\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4212218649517685,\n\
\ \"acc_stderr\": 0.028043399858210635,\n \"acc_norm\": 0.4212218649517685,\n\
\ \"acc_norm_stderr\": 0.028043399858210635\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.027237415094592477,\n\
\ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.027237415094592477\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101362,\n \
\ \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101362\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31421121251629724,\n\
\ \"acc_stderr\": 0.011855911587048228,\n \"acc_norm\": 0.31421121251629724,\n\
\ \"acc_norm_stderr\": 0.011855911587048228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29044117647058826,\n \"acc_stderr\": 0.027576468622740515,\n\
\ \"acc_norm\": 0.29044117647058826,\n \"acc_norm_stderr\": 0.027576468622740515\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3480392156862745,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.3480392156862745,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39183673469387753,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.39183673469387753,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.47761194029850745,\n\
\ \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.47761194029850745,\n\
\ \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120575,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120575\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4502923976608187,\n \"acc_stderr\": 0.038158273659132366,\n\
\ \"acc_norm\": 0.4502923976608187,\n \"acc_norm_stderr\": 0.038158273659132366\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4205675471010907,\n\
\ \"mc2_stderr\": 0.014623112128590065\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-11-59.032357.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-59.032357.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-11-59.032357.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_11_59.032357
path:
- results_2023-10-10T14-11-59.032357.parquet
- split: latest
path:
- results_2023-10-10T14-11-59.032357.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-tora-code-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-tora-code-7b-v1.0](https://huggingface.co/uukuguy/speechless-tora-code-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
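The example above targets a single configuration. As a minimal sketch (assuming a standard `datasets` installation and using this repository's name as shown above), you can also enumerate every available configuration, or load the `latest` split, which always points at the most recent run:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0"

# Enumerate the 61 per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# The "latest" split always points at the most recent evaluation run.
data = load_dataset(repo, "harness_truthfulqa_mc_0", split="latest")
print(data[0])
```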
## Latest results
These are the [latest results from run 2023-10-10T14:11:59.032357](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0/blob/main/results_2023-10-10T14-11-59.032357.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3873136911648318,
"acc_stderr": 0.03488491861505594,
"acc_norm": 0.39082023400364535,
"acc_norm_stderr": 0.034885243377185654,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4205675471010907,
"mc2_stderr": 0.014623112128590065
},
"harness|arc:challenge|25": {
"acc": 0.3822525597269625,
"acc_stderr": 0.014200454049979284,
"acc_norm": 0.42662116040955633,
"acc_norm_stderr": 0.014453185592920293
},
"harness|hellaswag|10": {
"acc": 0.48904600677155946,
"acc_stderr": 0.0049885838203099185,
"acc_norm": 0.6515634335789683,
"acc_norm_stderr": 0.0047550132430221265
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.38113207547169814,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.38113207547169814,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.040329990539607175,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.040329990539607175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.03078373675774565,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.03078373675774565
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633342,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633342
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38387096774193546,
"acc_stderr": 0.02766618207553964,
"acc_norm": 0.38387096774193546,
"acc_norm_stderr": 0.02766618207553964
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.03903698647748441,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.03903698647748441
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47150259067357514,
"acc_stderr": 0.03602573571288441,
"acc_norm": 0.47150259067357514,
"acc_norm_stderr": 0.03602573571288441
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.0273091405882302,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.0273091405882302
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.03169380235712997,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.03169380235712997
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45688073394495415,
"acc_stderr": 0.021357458785226213,
"acc_norm": 0.45688073394495415,
"acc_norm_stderr": 0.021357458785226213
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353602,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353602
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5147679324894515,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.5147679324894515,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009223,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009223
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04545454545454548,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04545454545454548
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6239316239316239,
"acc_stderr": 0.03173393632969482,
"acc_norm": 0.6239316239316239,
"acc_norm_stderr": 0.03173393632969482
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4891443167305236,
"acc_stderr": 0.017875748840242407,
"acc_norm": 0.4891443167305236,
"acc_norm_stderr": 0.017875748840242407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.026424816594009852,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.026424816594009852
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364557,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364557
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4212218649517685,
"acc_stderr": 0.028043399858210635,
"acc_norm": 0.4212218649517685,
"acc_norm_stderr": 0.028043399858210635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.027553366165101362,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.027553366165101362
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31421121251629724,
"acc_stderr": 0.011855911587048228,
"acc_norm": 0.31421121251629724,
"acc_norm_stderr": 0.011855911587048228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29044117647058826,
"acc_stderr": 0.027576468622740515,
"acc_norm": 0.29044117647058826,
"acc_norm_stderr": 0.027576468622740515
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3480392156862745,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.3480392156862745,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39183673469387753,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.39183673469387753,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.47761194029850745,
"acc_stderr": 0.035319879302087305,
"acc_norm": 0.47761194029850745,
"acc_norm_stderr": 0.035319879302087305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120575,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120575
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4502923976608187,
"acc_stderr": 0.038158273659132366,
"acc_norm": 0.4502923976608187,
"acc_norm_stderr": 0.038158273659132366
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4205675471010907,
"mc2_stderr": 0.014623112128590065
}
}
```
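To work with these aggregated numbers programmatically instead of reading the JSON above, a minimal sketch (assuming `pandas` is installed alongside `datasets`; the exact column layout of the parquet file may differ between harness versions) is to load the `results` configuration defined in the YAML header and convert it to a dataframe:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for each run;
# the "latest" split points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-tora-code-7b-v1.0",
    "results",
    split="latest",
)

df = results.to_pandas()  # requires pandas
print(df.columns.tolist())  # inspect the available columns first
print(df.head())
```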
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0 | 2023-10-10T14:14:10.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of llm-agents/tora-code-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-code-7b-v1.0](https://huggingface.co/llm-agents/tora-code-7b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:12:45.914011](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0/blob/main/results_2023-10-10T14-12-45.914011.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3370080390784741,\n\
\ \"acc_stderr\": 0.03404249985969968,\n \"acc_norm\": 0.34012672459882587,\n\
\ \"acc_norm_stderr\": 0.03404119087529977,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396743,\n \"mc2\": 0.3484016861988524,\n\
\ \"mc2_stderr\": 0.014498737856096499\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.378839590443686,\n \"acc_stderr\": 0.01417591549000032,\n\
\ \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009126\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5027882891854212,\n\
\ \"acc_stderr\": 0.004989703824167102,\n \"acc_norm\": 0.6586337382991436,\n\
\ \"acc_norm_stderr\": 0.004731989816563668\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.30566037735849055,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.30566037735849055,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378948,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378948\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.28835978835978837,\n \"acc_stderr\": 0.0233306540545359,\n \"\
acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.0233306540545359\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.34516129032258064,\n \"acc_stderr\": 0.027045746573534323,\n \"\
acc_norm\": 0.34516129032258064,\n \"acc_norm_stderr\": 0.027045746573534323\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292975,\n \"\
acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292975\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.48484848484848486,\n \"acc_stderr\": 0.03902551007374448,\n\
\ \"acc_norm\": 0.48484848484848486,\n \"acc_norm_stderr\": 0.03902551007374448\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29797979797979796,\n \"acc_stderr\": 0.032586303838365555,\n \"\
acc_norm\": 0.29797979797979796,\n \"acc_norm_stderr\": 0.032586303838365555\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30512820512820515,\n \"acc_stderr\": 0.023346335293325887,\n\
\ \"acc_norm\": 0.30512820512820515,\n \"acc_norm_stderr\": 0.023346335293325887\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.029344572500634335,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.029344572500634335\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3504587155963303,\n \"acc_stderr\": 0.020456077599824457,\n \"\
acc_norm\": 0.3504587155963303,\n \"acc_norm_stderr\": 0.020456077599824457\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03054674526495318,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03054674526495318\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.37254901960784315,\n \"acc_stderr\": 0.033933885849584046,\n \"\
acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.033933885849584046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4978902953586498,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.4978902953586498,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.04320767807536669,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.04320767807536669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n\
\ \"acc_stderr\": 0.047323326159788154,\n \"acc_norm\": 0.39814814814814814,\n\
\ \"acc_norm_stderr\": 0.047323326159788154\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3300970873786408,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.3300970873786408,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5042735042735043,\n\
\ \"acc_stderr\": 0.032754892643821316,\n \"acc_norm\": 0.5042735042735043,\n\
\ \"acc_norm_stderr\": 0.032754892643821316\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.421455938697318,\n\
\ \"acc_stderr\": 0.017657976412654857,\n \"acc_norm\": 0.421455938697318,\n\
\ \"acc_norm_stderr\": 0.017657976412654857\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.34104046242774566,\n \"acc_stderr\": 0.025522474632121612,\n\
\ \"acc_norm\": 0.34104046242774566,\n \"acc_norm_stderr\": 0.025522474632121612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3562091503267974,\n \"acc_stderr\": 0.02742047766262925,\n\
\ \"acc_norm\": 0.3562091503267974,\n \"acc_norm_stderr\": 0.02742047766262925\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.36977491961414793,\n\
\ \"acc_stderr\": 0.027417996705631,\n \"acc_norm\": 0.36977491961414793,\n\
\ \"acc_norm_stderr\": 0.027417996705631\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3271604938271605,\n \"acc_stderr\": 0.026105673861409818,\n\
\ \"acc_norm\": 0.3271604938271605,\n \"acc_norm_stderr\": 0.026105673861409818\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461004,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461004\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2926988265971317,\n\
\ \"acc_stderr\": 0.011620949195849535,\n \"acc_norm\": 0.2926988265971317,\n\
\ \"acc_norm_stderr\": 0.011620949195849535\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029807,\n\
\ \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029807\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3055555555555556,\n \"acc_stderr\": 0.018635594034423976,\n \
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.018635594034423976\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.39090909090909093,\n\
\ \"acc_stderr\": 0.046737523336702363,\n \"acc_norm\": 0.39090909090909093,\n\
\ \"acc_norm_stderr\": 0.046737523336702363\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03136250240935893,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03136250240935893\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4129353233830846,\n\
\ \"acc_stderr\": 0.034815208033673474,\n \"acc_norm\": 0.4129353233830846,\n\
\ \"acc_norm_stderr\": 0.034815208033673474\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.036965843170106004,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.036965843170106004\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529917,\n\
\ \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529917\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.014566506961396743,\n \"mc2\": 0.3484016861988524,\n\
\ \"mc2_stderr\": 0.014498737856096499\n }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-12-45.914011.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-12-45.914011.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-12-45.914011.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_12_45.914011
path:
- results_2023-10-10T14-12-45.914011.parquet
- split: latest
path:
- results_2023-10-10T14-12-45.914011.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-code-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-code-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-code-7b-v1.0](https://huggingface.co/llm-agents/tora-code-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
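Since the YAML above also defines a `results` config with a `latest` split, the aggregated metrics can be pulled the same way; a minimal sketch, assuming the same `datasets` API as in the snippet above:
```python
from datasets import load_dataset

# Aggregated metrics live in the dedicated "results" config;
# its "latest" split mirrors the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0",
                       "results",
                       split="latest")
```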
## Latest results
These are the [latest results from run 2023-10-10T14:12:45.914011](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0/blob/main/results_2023-10-10T14-12-45.914011.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3370080390784741,
"acc_stderr": 0.03404249985969968,
"acc_norm": 0.34012672459882587,
"acc_norm_stderr": 0.03404119087529977,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396743,
"mc2": 0.3484016861988524,
"mc2_stderr": 0.014498737856096499
},
"harness|arc:challenge|25": {
"acc": 0.378839590443686,
"acc_stderr": 0.01417591549000032,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009126
},
"harness|hellaswag|10": {
"acc": 0.5027882891854212,
"acc_stderr": 0.004989703824167102,
"acc_norm": 0.6586337382991436,
"acc_norm_stderr": 0.004731989816563668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.30566037735849055,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.30566037735849055,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.0233306540545359,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.0233306540545359
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848878,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848878
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.34516129032258064,
"acc_stderr": 0.027045746573534323,
"acc_norm": 0.34516129032258064,
"acc_norm_stderr": 0.027045746573534323
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292975,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292975
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.48484848484848486,
"acc_stderr": 0.03902551007374448,
"acc_norm": 0.48484848484848486,
"acc_norm_stderr": 0.03902551007374448
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29797979797979796,
"acc_stderr": 0.032586303838365555,
"acc_norm": 0.29797979797979796,
"acc_norm_stderr": 0.032586303838365555
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30512820512820515,
"acc_stderr": 0.023346335293325887,
"acc_norm": 0.30512820512820515,
"acc_norm_stderr": 0.023346335293325887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.029344572500634335,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.029344572500634335
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3504587155963303,
"acc_stderr": 0.020456077599824457,
"acc_norm": 0.3504587155963303,
"acc_norm_stderr": 0.020456077599824457
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03054674526495318,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03054674526495318
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.033933885849584046,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.033933885849584046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4978902953586498,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.4978902953586498,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3816793893129771,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.3816793893129771,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.04320767807536669,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.04320767807536669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.047323326159788154,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.047323326159788154
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.3300970873786408,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.3300970873786408,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5042735042735043,
"acc_stderr": 0.032754892643821316,
"acc_norm": 0.5042735042735043,
"acc_norm_stderr": 0.032754892643821316
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.421455938697318,
"acc_stderr": 0.017657976412654857,
"acc_norm": 0.421455938697318,
"acc_norm_stderr": 0.017657976412654857
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.025522474632121612,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.025522474632121612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3562091503267974,
"acc_stderr": 0.02742047766262925,
"acc_norm": 0.3562091503267974,
"acc_norm_stderr": 0.02742047766262925
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.36977491961414793,
"acc_stderr": 0.027417996705631,
"acc_norm": 0.36977491961414793,
"acc_norm_stderr": 0.027417996705631
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3271604938271605,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.3271604938271605,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340461004,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340461004
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2926988265971317,
"acc_stderr": 0.011620949195849535,
"acc_norm": 0.2926988265971317,
"acc_norm_stderr": 0.011620949195849535
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029807,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029807
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.018635594034423976,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.018635594034423976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.39090909090909093,
"acc_stderr": 0.046737523336702363,
"acc_norm": 0.39090909090909093,
"acc_norm_stderr": 0.046737523336702363
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.03136250240935893,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03136250240935893
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4129353233830846,
"acc_stderr": 0.034815208033673474,
"acc_norm": 0.4129353233830846,
"acc_norm_stderr": 0.034815208033673474
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.036965843170106004,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.036965843170106004
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4269005847953216,
"acc_stderr": 0.03793620616529917,
"acc_norm": 0.4269005847953216,
"acc_norm_stderr": 0.03793620616529917
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396743,
"mc2": 0.3484016861988524,
"mc2_stderr": 0.014498737856096499
}
}
```
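As an illustration, the per-subtask MMLU (`hendrycksTest`) accuracies above can be averaged back into a single score; a minimal sketch, assuming the JSON shown above has been parsed into a Python dict named `results` (a hypothetical variable introduced here only for illustration):
```python
# `results` is assumed to hold the parsed JSON dict printed above.
mmlu = {task: scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}
print(f"Mean 5-shot MMLU accuracy over {len(mmlu)} subtasks: "
      f"{sum(mmlu.values()) / len(mmlu):.4f}")
```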
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
orgcatorg/israel-hamas-gaza-cnn | 2023-10-11T00:06:16.000Z | [
"region:us"
] | orgcatorg | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: '@type'
dtype: string
- name: headline
dtype: string
- name: url
dtype: string
- name: dateModified
dtype: string
- name: datePublished
dtype: string
- name: mainEntityOfPage
dtype: string
- name: publisher
dtype: string
- name: author
dtype: string
- name: articleBody
dtype: string
- name: image
dtype: string
configs:
- config_name: default
data_files:
- split: train
path: data-*
---
# Dataset Card for "israel-hamas-gaza-cnn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
namespace-Pt/llm-embedder-data | 2023-10-10T14:46:39.000Z | [
"region:us"
] | namespace-Pt | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_llm-agents__tora-7b-v1.0 | 2023-10-10T14:35:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of llm-agents/tora-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-7b-v1.0](https://huggingface.co/llm-agents/tora-7b-v1.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-7b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:34:11.685092](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-7b-v1.0/blob/main/results_2023-10-10T14-34-11.685092.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46210391677330115,\n\
\ \"acc_stderr\": 0.0351995874362206,\n \"acc_norm\": 0.4656946953977969,\n\
\ \"acc_norm_stderr\": 0.035185915800634696,\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.3789924465917188,\n\
\ \"mc2_stderr\": 0.014709754655502841\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49402730375426623,\n \"acc_stderr\": 0.014610348300255793,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937735\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6056562437761402,\n\
\ \"acc_stderr\": 0.004877104939356237,\n \"acc_norm\": 0.7867954590718981,\n\
\ \"acc_norm_stderr\": 0.0040873390451062995\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851302,\n\
\ \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851302\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181617,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523857,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523857\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906045,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906045\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.035402943770953675,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.035402943770953675\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4205128205128205,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.4205128205128205,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6587155963302752,\n \"acc_stderr\": 0.020328612816592446,\n \"\
acc_norm\": 0.6587155963302752,\n \"acc_norm_stderr\": 0.020328612816592446\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.030546745264953195,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.030546745264953195\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6078431372549019,\n \"acc_stderr\": 0.03426712349247273,\n \"\
acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.03426712349247273\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811225,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811225\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.039223782906109894,\n\
\ \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.039223782906109894\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n\
\ \"acc_stderr\": 0.030463656747340265,\n \"acc_norm\": 0.6837606837606838,\n\
\ \"acc_norm_stderr\": 0.030463656747340265\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6232439335887612,\n\
\ \"acc_stderr\": 0.017328292907303047,\n \"acc_norm\": 0.6232439335887612,\n\
\ \"acc_norm_stderr\": 0.017328292907303047\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.028568699752225868,\n\
\ \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.028568699752225868\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.028099240775809553,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.028099240775809553\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.027794760105008746,\n\
\ \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.027794760105008746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35658409387222945,\n\
\ \"acc_stderr\": 0.01223364298927389,\n \"acc_norm\": 0.35658409387222945,\n\
\ \"acc_norm_stderr\": 0.01223364298927389\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928547,\n \
\ \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928547\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.03191282052669277,\n\
\ \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.03191282052669277\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.0343751933733825,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.0343751933733825\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n\
\ \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.3789924465917188,\n\
\ \"mc2_stderr\": 0.014709754655502841\n }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-34-11.685092.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- results_2023-10-10T14-34-11.685092.parquet
- split: latest
path:
- results_2023-10-10T14-34-11.685092.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-7b-v1.0](https://huggingface.co/llm-agents/tora-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-7b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
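The aggregated metrics can be loaded the same way. As a minimal sketch (assuming the `results` config and `latest` split listed in the YAML metadata above):
```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" always points to the newest results file
results = load_dataset("open-llm-leaderboard/details_llm-agents__tora-7b-v1.0",
                       "results",
                       split="latest")
print(results[0])
```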
## Latest results
These are the [latest results from run 2023-10-10T14:34:11.685092](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-7b-v1.0/blob/main/results_2023-10-10T14-34-11.685092.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46210391677330115,
"acc_stderr": 0.0351995874362206,
"acc_norm": 0.4656946953977969,
"acc_norm_stderr": 0.035185915800634696,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.3789924465917188,
"mc2_stderr": 0.014709754655502841
},
"harness|arc:challenge|25": {
"acc": 0.49402730375426623,
"acc_stderr": 0.014610348300255793,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937735
},
"harness|hellaswag|10": {
"acc": 0.6056562437761402,
"acc_stderr": 0.004877104939356237,
"acc_norm": 0.7867954590718981,
"acc_norm_stderr": 0.0040873390451062995
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181617,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523857,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523857
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906045,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906045
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.035402943770953675,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.035402943770953675
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4205128205128205,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.4205128205128205,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6587155963302752,
"acc_stderr": 0.020328612816592446,
"acc_norm": 0.6587155963302752,
"acc_norm_stderr": 0.020328612816592446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.030546745264953195,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.030546745264953195
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.03426712349247273,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.03426712349247273
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811225,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811225
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.039223782906109894,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.039223782906109894
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.030463656747340265,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.030463656747340265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6232439335887612,
"acc_stderr": 0.017328292907303047,
"acc_norm": 0.6232439335887612,
"acc_norm_stderr": 0.017328292907303047
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48265895953757226,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.48265895953757226,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.028568699752225868,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.028568699752225868
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.028099240775809553,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.028099240775809553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.027794760105008746,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.027794760105008746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35658409387222945,
"acc_stderr": 0.01223364298927389,
"acc_norm": 0.35658409387222945,
"acc_norm_stderr": 0.01223364298927389
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928547,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.03191282052669277,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.03191282052669277
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.0343751933733825,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.0343751933733825
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.3789924465917188,
"mc2_stderr": 0.014709754655502841
}
}
```
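Each task above also has its own config, so per-sample details for a single eval can be inspected directly. A hedged example, using the `harness_hendrycksTest_world_religions_5` config and the run-timestamp split names from the YAML metadata above:
```python
from datasets import load_dataset

# Per-sample details for one MMLU task; splits are named after the run timestamp
details = load_dataset(
    "open-llm-leaderboard/details_llm-agents__tora-7b-v1.0",
    "harness_hendrycksTest_world_religions_5",
    split="2023_10_10T14_34_11.685092",  # equivalently: split="latest"
)
print(details)
```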
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mponty/code_champs_solutions | 2023-10-10T15:34:13.000Z | [
"region:us"
] | mponty | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: submission_id
dtype: string
- name: problem_id
dtype: string
- name: date
dtype: int64
- name: language
dtype: string
- name: verdict
dtype: string
- name: cpu_time
dtype: int64
- name: memory
dtype: int64
- name: code
dtype: string
- name: source
dtype: string
- name: testcount
dtype: int64
- name: lenght
dtype: int64
splits:
- name: train
num_bytes: 48699691541
num_examples: 34994861
download_size: 18591747965
dataset_size: 48699691541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_champs_solutions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liangyinchen/ADL_hw1 | 2023-10-10T14:49:26.000Z | [
"region:us"
] | liangyinchen | null | null | null | 0 | 0 | Entry not found |
mponty/code_champs_meta | 2023-10-10T15:08:17.000Z | [
"region:us"
] | mponty | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: problem_id
dtype: string
- name: contest
dtype: string
- name: problem
dtype: string
- name: lang
dtype: string
- name: problem_title
dtype: string
- name: problem_statement
dtype: string
- name: page
dtype: string
- name: long_tags
dtype: string
- name: short_tags
dtype: string
- name: tutorial_link
dtype: string
- name: tutorial_page
dtype: string
splits:
- name: train
num_bytes: 3683371047
num_examples: 16504
download_size: 457183665
dataset_size: 3683371047
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_champs_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus | 2023-10-10T14:56:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lgaalves/tinyllama-1.1b-chat-v0.3_platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/tinyllama-1.1b-chat-v0.3_platypus](https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:53:56.428911](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus/blob/main/results_2023-10-10T14-53-56.428911.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2642090602312682,\n\
\ \"acc_stderr\": 0.03197682325323511,\n \"acc_norm\": 0.2668869926455615,\n\
\ \"acc_norm_stderr\": 0.03198355606162418,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.39153421911238995,\n\
\ \"mc2_stderr\": 0.014139728525871488\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2764505119453925,\n \"acc_stderr\": 0.013069662474252428,\n\
\ \"acc_norm\": 0.302901023890785,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41963752240589525,\n\
\ \"acc_stderr\": 0.0049249104331063566,\n \"acc_norm\": 0.551185022903804,\n\
\ \"acc_norm_stderr\": 0.00496356702912906\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.024959918028911277,\n\
\ \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.024959918028911277\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496234,\n\
\ \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496234\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276863,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276863\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n\
\ \"acc_stderr\": 0.024251071262208834,\n \"acc_norm\": 0.23870967741935484,\n\
\ \"acc_norm_stderr\": 0.024251071262208834\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733545,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816525,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816525\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.03115626951964686,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.03115626951964686\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845447,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.026064313406304527,\n\
\ \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.026064313406304527\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.01812566918086148,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.01812566918086148\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.0305467452649532,\n \"acc_norm\"\
: 0.2777777777777778,\n \"acc_norm_stderr\": 0.0305467452649532\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.29411764705882354,\n\
\ \"acc_stderr\": 0.03198001660115071,\n \"acc_norm\": 0.29411764705882354,\n\
\ \"acc_norm_stderr\": 0.03198001660115071\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.2062780269058296,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924034,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924034\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3034188034188034,\n\
\ \"acc_stderr\": 0.030118210106942635,\n \"acc_norm\": 0.3034188034188034,\n\
\ \"acc_norm_stderr\": 0.030118210106942635\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n\
\ \"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.26181353767560667,\n\
\ \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.024547617794803835,\n\
\ \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.024547617794803835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22695035460992907,\n \"acc_stderr\": 0.02498710636564297,\n \
\ \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.02498710636564297\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2666232073011734,\n\
\ \"acc_stderr\": 0.01129383603161214,\n \"acc_norm\": 0.2666232073011734,\n\
\ \"acc_norm_stderr\": 0.01129383603161214\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.025767252010855956,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.025767252010855956\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960227,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960227\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.03384429155233135,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.03384429155233135\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.39153421911238995,\n\
\ \"mc2_stderr\": 0.014139728525871488\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-53-56.428911.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-53-56.428911.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-53-56.428911.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_53_56.428911
path:
- results_2023-10-10T14-53-56.428911.parquet
- split: latest
path:
- results_2023-10-10T14-53-56.428911.parquet
---
# Dataset Card for Evaluation run of lgaalves/tinyllama-1.1b-chat-v0.3_platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/tinyllama-1.1b-chat-v0.3_platypus](https://huggingface.co/lgaalves/tinyllama-1.1b-chat-v0.3_platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
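# Load the per-sample details for one evaluation task of this run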
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus",
"harness_truthfulqa_mc_0",
split="train")
```
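As a minimal sketch (the repository id, configuration names, and split names below are taken from this card's own metadata, so only the general usage pattern is assumed), you can also point at a specific timestamped split, or load the aggregated "results" configuration that the leaderboard uses:
```python
from datasets import load_dataset

# Per-sample details for an explicit run timestamp (equivalent to "latest" here,
# since this card was created from a single run)
details = load_dataset(
    "open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus",
    "harness_truthfulqa_mc_0",
    split="2023_10_10T14_53_56.428911",
)

# Aggregated metrics for the whole run, as shown in the "Latest results" section below
results = load_dataset(
    "open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus",
    "results",
    split="latest",
)
```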
## Latest results
These are the [latest results from run 2023-10-10T14:53:56.428911](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__tinyllama-1.1b-chat-v0.3_platypus/blob/main/results_2023-10-10T14-53-56.428911.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2642090602312682,
"acc_stderr": 0.03197682325323511,
"acc_norm": 0.2668869926455615,
"acc_norm_stderr": 0.03198355606162418,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.39153421911238995,
"mc2_stderr": 0.014139728525871488
},
"harness|arc:challenge|25": {
"acc": 0.2764505119453925,
"acc_stderr": 0.013069662474252428,
"acc_norm": 0.302901023890785,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.41963752240589525,
"acc_stderr": 0.0049249104331063566,
"acc_norm": 0.551185022903804,
"acc_norm_stderr": 0.00496356702912906
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.024959918028911277,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.024959918028911277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496234,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496234
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276863,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276863
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208834,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208834
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733545,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816525,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816525
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.03115626951964686,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.03115626951964686
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845447,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20168067226890757,
"acc_stderr": 0.026064313406304527,
"acc_norm": 0.20168067226890757,
"acc_norm_stderr": 0.026064313406304527
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.01812566918086148,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.01812566918086148
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.0305467452649532,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.0305467452649532
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924034,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924034
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.038946411200447915,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.038946411200447915
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3034188034188034,
"acc_stderr": 0.030118210106942635,
"acc_norm": 0.3034188034188034,
"acc_norm_stderr": 0.030118210106942635
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.024547617794803835,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.024547617794803835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.02498710636564297,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.02498710636564297
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2666232073011734,
"acc_stderr": 0.01129383603161214,
"acc_norm": 0.2666232073011734,
"acc_norm_stderr": 0.01129383603161214
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.025767252010855956,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.025767252010855956
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960227,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960227
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233135,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233135
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.39153421911238995,
"mc2_stderr": 0.014139728525871488
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
simayy/ml4se-test-dataset | 2023-10-10T14:56:21.000Z | [
"region:us"
] | simayy | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0 | 2023-10-10T14:57:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of llm-agents/tora-code-13b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-code-13b-v1.0](https://huggingface.co/llm-agents/tora-code-13b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:56:19.008780](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0/blob/main/results_2023-10-10T14-56-19.008780.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37026259300970477,\n\
\ \"acc_stderr\": 0.03445992590442932,\n \"acc_norm\": 0.3735486412052473,\n\
\ \"acc_norm_stderr\": 0.03445507842357752,\n \"mc1\": 0.2178702570379437,\n\
\ \"mc1_stderr\": 0.014450846714123899,\n \"mc2\": 0.3498430573399945,\n\
\ \"mc2_stderr\": 0.01469641873096921\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4206484641638225,\n \"acc_stderr\": 0.014426211252508406,\n\
\ \"acc_norm\": 0.4445392491467577,\n \"acc_norm_stderr\": 0.014521226405627077\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.522903804023103,\n\
\ \"acc_stderr\": 0.004984543540932333,\n \"acc_norm\": 0.6928898625771759,\n\
\ \"acc_norm_stderr\": 0.004603527017557854\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.0394170763206489,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.0394170763206489\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.02339382650048487,\n \"acc_norm\"\
: 0.291005291005291,\n \"acc_norm_stderr\": 0.02339382650048487\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.36451612903225805,\n \"acc_stderr\": 0.027379871229943252,\n \"\
acc_norm\": 0.36451612903225805,\n \"acc_norm_stderr\": 0.027379871229943252\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n \"\
acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.0390369864774844,\n\
\ \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.0390369864774844\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3838383838383838,\n \"acc_stderr\": 0.03464881675016338,\n \"\
acc_norm\": 0.3838383838383838,\n \"acc_norm_stderr\": 0.03464881675016338\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.39896373056994816,\n \"acc_stderr\": 0.035339990940656964,\n\
\ \"acc_norm\": 0.39896373056994816,\n \"acc_norm_stderr\": 0.035339990940656964\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2743589743589744,\n \"acc_stderr\": 0.022622765767493207,\n\
\ \"acc_norm\": 0.2743589743589744,\n \"acc_norm_stderr\": 0.022622765767493207\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372184,\n\
\ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372184\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.41284403669724773,\n \"acc_stderr\": 0.021109128133413906,\n \"\
acc_norm\": 0.41284403669724773,\n \"acc_norm_stderr\": 0.021109128133413906\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.46568627450980393,\n \"acc_stderr\": 0.03501038327635897,\n\
\ \"acc_norm\": 0.46568627450980393,\n \"acc_norm_stderr\": 0.03501038327635897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5147679324894515,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.5147679324894515,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4628099173553719,\n \"acc_stderr\": 0.04551711196104218,\n \"\
acc_norm\": 0.4628099173553719,\n \"acc_norm_stderr\": 0.04551711196104218\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.36809815950920244,\n \"acc_stderr\": 0.03789213935838396,\n\
\ \"acc_norm\": 0.36809815950920244,\n \"acc_norm_stderr\": 0.03789213935838396\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212238,\n\
\ \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212238\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.031937057262002924,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.031937057262002924\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.47126436781609193,\n\
\ \"acc_stderr\": 0.01785041079438017,\n \"acc_norm\": 0.47126436781609193,\n\
\ \"acc_norm_stderr\": 0.01785041079438017\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.40173410404624277,\n \"acc_stderr\": 0.026394104177643627,\n\
\ \"acc_norm\": 0.40173410404624277,\n \"acc_norm_stderr\": 0.026394104177643627\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.02814640599309636,\n\
\ \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.02814640599309636\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43086816720257237,\n\
\ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.43086816720257237,\n\
\ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3734567901234568,\n \"acc_stderr\": 0.02691500301138015,\n\
\ \"acc_norm\": 0.3734567901234568,\n \"acc_norm_stderr\": 0.02691500301138015\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503793,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503793\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29595827900912647,\n\
\ \"acc_stderr\": 0.01165851852527704,\n \"acc_norm\": 0.29595827900912647,\n\
\ \"acc_norm_stderr\": 0.01165851852527704\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n\
\ \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35294117647058826,\n \"acc_stderr\": 0.019333142020797077,\n \
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.019333142020797077\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673281,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673281\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.3674698795180723,\n \"acc_stderr\": 0.03753267402120574,\n\
\ \"acc_norm\": 0.3674698795180723,\n \"acc_norm_stderr\": 0.03753267402120574\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4327485380116959,\n\
\ \"acc_stderr\": 0.037999786443706066,\n \"acc_norm\": 0.4327485380116959,\n\
\ \"acc_norm_stderr\": 0.037999786443706066\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2178702570379437,\n \"mc1_stderr\": 0.014450846714123899,\n\
\ \"mc2\": 0.3498430573399945,\n \"mc2_stderr\": 0.01469641873096921\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-13b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-56-19.008780.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-56-19.008780.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-56-19.008780.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_56_19.008780
path:
- results_2023-10-10T14-56-19.008780.parquet
- split: latest
path:
- results_2023-10-10T14-56-19.008780.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-code-13b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-code-13b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-code-13b-v1.0](https://huggingface.co/llm-agents/tora-code-13b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
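The same API can also be used to discover which of the 61 configurations exist and to pin the `latest` split instead of `train`. This is a minimal sketch, assuming the `datasets` library is installed and the repository is publicly readable; the configuration and split names are the ones listed in this card:
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0"

# List every evaluation configuration stored in this repository
# (one per task, plus the aggregated "results" configuration).
configs = get_dataset_config_names(REPO)
print(len(configs), "configs, e.g.:", configs[:3])

# Load the most recent run of a single MMLU subtask; the "latest"
# split always points at the newest timestamped run.
details = load_dataset(REPO, "harness_hendrycksTest_marketing_5", split="latest")
print(details)
```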
## Latest results
These are the [latest results from run 2023-10-10T14:56:19.008780](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0/blob/main/results_2023-10-10T14-56-19.008780.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.37026259300970477,
"acc_stderr": 0.03445992590442932,
"acc_norm": 0.3735486412052473,
"acc_norm_stderr": 0.03445507842357752,
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3498430573399945,
"mc2_stderr": 0.01469641873096921
},
"harness|arc:challenge|25": {
"acc": 0.4206484641638225,
"acc_stderr": 0.014426211252508406,
"acc_norm": 0.4445392491467577,
"acc_norm_stderr": 0.014521226405627077
},
"harness|hellaswag|10": {
"acc": 0.522903804023103,
"acc_stderr": 0.004984543540932333,
"acc_norm": 0.6928898625771759,
"acc_norm_stderr": 0.004603527017557854
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3660377358490566,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.3660377358490566,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.0394170763206489,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.0394170763206489
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.02339382650048487,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.02339382650048487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36451612903225805,
"acc_stderr": 0.027379871229943252,
"acc_norm": 0.36451612903225805,
"acc_norm_stderr": 0.027379871229943252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.0390369864774844,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.0390369864774844
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3838383838383838,
"acc_stderr": 0.03464881675016338,
"acc_norm": 0.3838383838383838,
"acc_norm_stderr": 0.03464881675016338
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39896373056994816,
"acc_stderr": 0.035339990940656964,
"acc_norm": 0.39896373056994816,
"acc_norm_stderr": 0.035339990940656964
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2743589743589744,
"acc_stderr": 0.022622765767493207,
"acc_norm": 0.2743589743589744,
"acc_norm_stderr": 0.022622765767493207
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.029213549414372184,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.029213549414372184
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.41284403669724773,
"acc_stderr": 0.021109128133413906,
"acc_norm": 0.41284403669724773,
"acc_norm_stderr": 0.021109128133413906
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46568627450980393,
"acc_stderr": 0.03501038327635897,
"acc_norm": 0.46568627450980393,
"acc_norm_stderr": 0.03501038327635897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5147679324894515,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.5147679324894515,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4628099173553719,
"acc_stderr": 0.04551711196104218,
"acc_norm": 0.4628099173553719,
"acc_norm_stderr": 0.04551711196104218
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.36809815950920244,
"acc_stderr": 0.03789213935838396,
"acc_norm": 0.36809815950920244,
"acc_norm_stderr": 0.03789213935838396
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212238,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.031937057262002924,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.031937057262002924
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.47126436781609193,
"acc_stderr": 0.01785041079438017,
"acc_norm": 0.47126436781609193,
"acc_norm_stderr": 0.01785041079438017
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.40173410404624277,
"acc_stderr": 0.026394104177643627,
"acc_norm": 0.40173410404624277,
"acc_norm_stderr": 0.026394104177643627
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.02814640599309636,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.02814640599309636
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.43086816720257237,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.43086816720257237,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3734567901234568,
"acc_stderr": 0.02691500301138015,
"acc_norm": 0.3734567901234568,
"acc_norm_stderr": 0.02691500301138015
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503793,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503793
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29595827900912647,
"acc_stderr": 0.01165851852527704,
"acc_norm": 0.29595827900912647,
"acc_norm_stderr": 0.01165851852527704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.019333142020797077,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.019333142020797077
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.42727272727272725,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.42727272727272725,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673281,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673281
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4327485380116959,
"acc_stderr": 0.037999786443706066,
"acc_norm": 0.4327485380116959,
"acc_norm_stderr": 0.037999786443706066
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2178702570379437,
"mc1_stderr": 0.014450846714123899,
"mc2": 0.3498430573399945,
"mc2_stderr": 0.01469641873096921
}
}
```
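The same figures can be pulled programmatically from the aggregated `results` configuration rather than copied from the JSON above. A minimal sketch, assuming the parquet schema produced by the evaluation harness (the exact column layout may vary between runs, so the example inspects the schema instead of hard-coding field names):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of a run;
# the "latest" split points at results_2023-10-10T14-56-19.008780.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0",
    "results",
    split="latest",
)

# Inspect the schema before relying on any particular field.
print(results.column_names)
print(results[0])
```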
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1 | 2023-10-10T14:58:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B-v1](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T14:57:20.867230](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1/blob/main/results_2023-10-10T14-57-20.867230.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.5221924256666464,\n\
\ \"acc_stderr\": 0.03497779761198706,\n \"acc_norm\": 0.5257525929962562,\n\
\ \"acc_norm_stderr\": 0.03496709701060229,\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.5936287801538656,\n\
\ \"mc2_stderr\": 0.015090925037000012\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5290102389078498,\n \"acc_norm_stderr\": 0.01458677635529431\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5657239593706433,\n\
\ \"acc_stderr\": 0.004946485466544624,\n \"acc_norm\": 0.7467635929097789,\n\
\ \"acc_norm_stderr\": 0.0043397644342190655\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5870967741935483,\n\
\ \"acc_stderr\": 0.028009138125400387,\n \"acc_norm\": 0.5870967741935483,\n\
\ \"acc_norm_stderr\": 0.028009138125400387\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986476,\n\
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986476\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.710091743119266,\n \"acc_stderr\": 0.0194530666092016,\n \"acc_norm\"\
: 0.710091743119266,\n \"acc_norm_stderr\": 0.0194530666092016\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105307,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7113665389527458,\n\
\ \"acc_stderr\": 0.016203792703197793,\n \"acc_norm\": 0.7113665389527458,\n\
\ \"acc_norm_stderr\": 0.016203792703197793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952236,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952236\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372432,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372432\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.028408302020332694,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.028408302020332694\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38722294654498046,\n\
\ \"acc_stderr\": 0.012441155326854926,\n \"acc_norm\": 0.38722294654498046,\n\
\ \"acc_norm_stderr\": 0.012441155326854926\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.030359697079046104,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.030359697079046104\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355044,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.01722562708366086,\n \"mc2\": 0.5936287801538656,\n\
\ \"mc2_stderr\": 0.015090925037000012\n }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-57-20.867230.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-57-20.867230.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-57-20.867230.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_57_20.867230
path:
- results_2023-10-10T14-57-20.867230.parquet
- split: latest
path:
- results_2023-10-10T14-57-20.867230.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B-v1](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1",
"harness_truthfulqa_mc_0",
split="train")
```
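The configurations above also expose a `latest` split (plus one split per timestamped run) and an aggregated `results` configuration. As a minimal sketch (assuming only the `datasets` library, with configuration names taken from the list above):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1"

# Load a single MMLU subtask; the "latest" split always tracks the most recent run.
world_religions = load_dataset(REPO, "harness_hendrycksTest_world_religions_5", split="latest")

# Load the aggregated metrics stored in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")
```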
## Latest results
These are the [latest results from run 2023-10-10T14:57:20.867230](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B-v1/blob/main/results_2023-10-10T14-57-20.867230.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5221924256666464,
"acc_stderr": 0.03497779761198706,
"acc_norm": 0.5257525929962562,
"acc_norm_stderr": 0.03496709701060229,
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.5936287801538656,
"mc2_stderr": 0.015090925037000012
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5290102389078498,
"acc_norm_stderr": 0.01458677635529431
},
"harness|hellaswag|10": {
"acc": 0.5657239593706433,
"acc_stderr": 0.004946485466544624,
"acc_norm": 0.7467635929097789,
"acc_norm_stderr": 0.0043397644342190655
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.028009138125400387,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.028009138125400387
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986476,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.0194530666092016,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.0194530666092016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105307,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7113665389527458,
"acc_stderr": 0.016203792703197793,
"acc_norm": 0.7113665389527458,
"acc_norm_stderr": 0.016203792703197793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952236,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952236
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372432,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372432
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.028408302020332694,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.028408302020332694
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38722294654498046,
"acc_stderr": 0.012441155326854926,
"acc_norm": 0.38722294654498046,
"acc_norm_stderr": 0.012441155326854926
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.030359697079046104,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.030359697079046104
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.0202239460050743,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.0202239460050743
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355044,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4112607099143207,
"mc1_stderr": 0.01722562708366086,
"mc2": 0.5936287801538656,
"mc2_stderr": 0.015090925037000012
}
}
```
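To turn the raw blob above into a quick per-task ranking, here is a minimal sketch (assuming the JSON has been saved locally as `results.json`, a hypothetical copy of the linked file) that sorts the MMLU subtasks by normalized accuracy:
```python
import json

# Load a local copy of the results JSON shown above (hypothetical path).
with open("results.json") as f:
    results = json.load(f)

# Keep only the per-subtask MMLU entries; every one of them reports "acc_norm".
mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}

# Print subtasks from strongest to weakest (str.removeprefix needs Python 3.9+).
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{acc_norm:.3f}  {task.removeprefix('harness|hendrycksTest-')}")
```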
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
result-kand2-sdxl-wuerst-karlo/7b7794aa | 2023-10-10T14:58:52.000Z | [
"region:us"
] | result-kand2-sdxl-wuerst-karlo | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 166
num_examples: 10
download_size: 1306
dataset_size: 166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "7b7794aa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
giuseppemartino/i-SAID_custom_or_1 | 2023-10-10T16:04:49.000Z | [
"region:us"
] | giuseppemartino | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 6362576122.0
num_examples: 840
- name: validation
num_bytes: 905977299.0
num_examples: 99
download_size: 7262651438
dataset_size: 7268553421.0
---
# Dataset Card for "i-SAID_custom_or_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rewcifer/radio-llama2-5pct-filtered | 2023-10-10T15:01:31.000Z | [
"region:us"
] | Rewcifer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5401871
num_examples: 1000
download_size: 1248779
dataset_size: 5401871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "radio-llama2-5pct-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phanvancongthanh/pubchem_bioassay | 2023-10-10T15:10:38.000Z | [
"region:us"
] | phanvancongthanh | null | null | null | 0 | 0 | Entry not found |
asmallgreenpotato/test-start | 2023-10-10T16:58:44.000Z | [
"region:us"
] | asmallgreenpotato | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_llm-agents__tora-13b-v1.0 | 2023-10-10T15:18:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of llm-agents/tora-13b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-13b-v1.0](https://huggingface.co/llm-agents/tora-13b-v1.0) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-13b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T15:17:02.134278](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0/blob/main/results_2023-10-10T15-17-02.134278.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5490034558552397,\n\
\ \"acc_stderr\": 0.03446388198618693,\n \"acc_norm\": 0.5527099657969922,\n\
\ \"acc_norm_stderr\": 0.0344447014433355,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4025446800568436,\n\
\ \"mc2_stderr\": 0.015003901494005132\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5580204778156996,\n \"acc_stderr\": 0.014512682523128342,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642664\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6360286795459071,\n\
\ \"acc_stderr\": 0.004801572028920793,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.0038076803311729033\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n\
\ \"acc_stderr\": 0.04389869956808778,\n \"acc_norm\": 0.2647058823529412,\n\
\ \"acc_norm_stderr\": 0.04389869956808778\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"\
acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7412844036697248,\n \"acc_stderr\": 0.018776052319619627,\n \"\
acc_norm\": 0.7412844036697248,\n \"acc_norm_stderr\": 0.018776052319619627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n\
\ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.024904439098918214,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.024904439098918214\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.735632183908046,\n\
\ \"acc_stderr\": 0.015769984840690518,\n \"acc_norm\": 0.735632183908046,\n\
\ \"acc_norm_stderr\": 0.015769984840690518\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.02595005433765407,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.02595005433765407\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095277,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095277\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347813,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n\
\ \"acc_stderr\": 0.012555701346703384,\n \"acc_norm\": 0.408735332464146,\n\
\ \"acc_norm_stderr\": 0.012555701346703384\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5326797385620915,\n \"acc_stderr\": 0.0201845833591022,\n \
\ \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.0201845833591022\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674269,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674269\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4025446800568436,\n\
\ \"mc2_stderr\": 0.015003901494005132\n }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-13b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-17-02.134278.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-17-02.134278.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-17-02.134278.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_17_02.134278
path:
- results_2023-10-10T15-17-02.134278.parquet
- split: latest
path:
- results_2023-10-10T15-17-02.134278.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-13b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-13b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-13b-v1.0](https://huggingface.co/llm-agents/tora-13b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-13b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
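Each evaluated task also has its own configuration (the config names are listed in the YAML header above), and every configuration exposes a `latest` split alongside the timestamped one. As a minimal sketch using one of those per-subtask config names, loading a single MMLU subtask could look like:
```python
from datasets import load_dataset

# "latest" always points at the newest timestamped run of this subtask
world_religions = load_dataset(
    "open-llm-leaderboard/details_llm-agents__tora-13b-v1.0",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
```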
## Latest results
These are the [latest results from run 2023-10-10T15:17:02.134278](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0/blob/main/results_2023-10-10T15-17-02.134278.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5490034558552397,
"acc_stderr": 0.03446388198618693,
"acc_norm": 0.5527099657969922,
"acc_norm_stderr": 0.0344447014433355,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4025446800568436,
"mc2_stderr": 0.015003901494005132
},
"harness|arc:challenge|25": {
"acc": 0.5580204778156996,
"acc_stderr": 0.014512682523128342,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642664
},
"harness|hellaswag|10": {
"acc": 0.6360286795459071,
"acc_stderr": 0.004801572028920793,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.0038076803311729033
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7412844036697248,
"acc_stderr": 0.018776052319619627,
"acc_norm": 0.7412844036697248,
"acc_norm_stderr": 0.018776052319619627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.024904439098918214,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.024904439098918214
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.735632183908046,
"acc_stderr": 0.015769984840690518,
"acc_norm": 0.735632183908046,
"acc_norm_stderr": 0.015769984840690518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.02595005433765407,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.02595005433765407
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095277,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095277
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347813,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.408735332464146,
"acc_stderr": 0.012555701346703384,
"acc_norm": 0.408735332464146,
"acc_norm_stderr": 0.012555701346703384
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.0201845833591022,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.0201845833591022
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674269,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674269
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4025446800568436,
"mc2_stderr": 0.015003901494005132
}
}
```
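The aggregated numbers above are also stored as rows of the `results` configuration, so they can be read back programmatically. A minimal sketch:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics of the run;
# its "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_llm-agents__tora-13b-v1.0",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics recorded for this run
```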
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
text2font/words_with_path_tags_version_2 | 2023-10-10T15:20:12.000Z | [
"region:us"
] | text2font | null | null | null | 0 | 0 | Entry not found |
text2font/words_with_path_tags_version_2_splitted | 2023-10-10T15:20:24.000Z | [
"region:us"
] | text2font | null | null | null | 0 | 0 | Entry not found |
hacktoberfest-corpus-es/colmbian_spanish_news | 2023-10-10T15:31:25.000Z | [
"license:cc-by-2.0",
"region:us"
] | hacktoberfest-corpus-es | null | null | null | 0 | 0 | ---
license: cc-by-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: news_id
dtype: string
- name: news_url_absolute
dtype: string
- name: news_init_date
dtype: string
- name: news_final_date
dtype: string
- name: news_title
dtype: string
- name: news_text_content
dtype: string
- name: entailment
dtype: float64
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 262518060.61903325
num_examples: 60920
- name: test
num_bytes: 13130212.257160116
num_examples: 3047
- name: valid
num_bytes: 52503612.12380665
num_examples: 12184
download_size: 195538787
dataset_size: 328151885.0
---
|
open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu | 2023-10-10T15:26:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of itsliupeng/llama2_7b_mmlu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama2_7b_mmlu](https://huggingface.co/itsliupeng/llama2_7b_mmlu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T15:25:23.413789](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu/blob/main/results_2023-10-10T15-25-23.413789.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5988501243208318,\n\
\ \"acc_stderr\": 0.03358876037616636,\n \"acc_norm\": 0.6029683890136457,\n\
\ \"acc_norm_stderr\": 0.03357264852090248,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842888,\n \"mc2\": 0.40950657377856753,\n\
\ \"mc2_stderr\": 0.013879529639480087\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490973,\n\
\ \"acc_norm\": 0.5614334470989761,\n \"acc_norm_stderr\": 0.014500682618212865\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5918143796056562,\n\
\ \"acc_stderr\": 0.004904933500255873,\n \"acc_norm\": 0.7912766381198965,\n\
\ \"acc_norm_stderr\": 0.004055657006965434\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517418,\n \"\
acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517418\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n\
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229093,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229093\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990944,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990944\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209807,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209807\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n\
\ \"acc_stderr\": 0.015852002449862096,\n \"acc_norm\": 0.3407821229050279,\n\
\ \"acc_norm_stderr\": 0.015852002449862096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622864,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622864\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.02997280717046462,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.02997280717046462\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412236,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412236\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842888,\n \"mc2\": 0.40950657377856753,\n\
\ \"mc2_stderr\": 0.013879529639480087\n }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama2_7b_mmlu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-25-23.413789.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- results_2023-10-10T15-25-23.413789.parquet
- split: latest
path:
- results_2023-10-10T15-25-23.413789.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama2_7b_mmlu
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/llama2_7b_mmlu
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_mmlu](https://huggingface.co/itsliupeng/llama2_7b_mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu",
"harness_truthfulqa_mc_0",
split="train")
```
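For the aggregated scores, the `results` config listed in the metadata above can be loaded the same way; its `latest` split always resolves to the most recent run. A minimal sketch:
```python
from datasets import load_dataset

# Load the aggregated results config; the "latest" split points to the
# most recent evaluation run (here 2023-10-10T15:25:23.413789).
results = load_dataset(
    "open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu",
    "results",
    split="latest",
)
```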
## Latest results
These are the [latest results from run 2023-10-10T15:25:23.413789](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu/blob/main/results_2023-10-10T15-25-23.413789.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5988501243208318,
"acc_stderr": 0.03358876037616636,
"acc_norm": 0.6029683890136457,
"acc_norm_stderr": 0.03357264852090248,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842888,
"mc2": 0.40950657377856753,
"mc2_stderr": 0.013879529639480087
},
"harness|arc:challenge|25": {
"acc": 0.5179180887372014,
"acc_stderr": 0.014602005585490973,
"acc_norm": 0.5614334470989761,
"acc_norm_stderr": 0.014500682618212865
},
"harness|hellaswag|10": {
"acc": 0.5918143796056562,
"acc_stderr": 0.004904933500255873,
"acc_norm": 0.7912766381198965,
"acc_norm_stderr": 0.004055657006965434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229093,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229093
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990944,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990944
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209807,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209807
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.015852002449862096,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.015852002449862096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998482,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622864,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622864
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.02997280717046462,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.02997280717046462
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412236,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412236
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842888,
"mc2": 0.40950657377856753,
"mc2_stderr": 0.013879529639480087
}
}
```
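To recompute an MMLU-style average from this payload yourself, a minimal sketch (assuming the JSON block above has been saved locally as `results.json`; the file name is hypothetical, while the task keys and `acc` fields follow the block shown above):
```python
import json

# Assumption: the JSON payload above was saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

# Unweighted average accuracy over the MMLU (hendrycksTest) subtasks,
# skipping the "all" aggregate and the non-MMLU harness entries.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"Unweighted MMLU average over {len(mmlu_accs)} subtasks: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```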
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_JosephusCheung__LL7M | 2023-10-10T15:28:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of JosephusCheung/LL7M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/LL7M](https://huggingface.co/JosephusCheung/LL7M) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__LL7M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T15:26:54.562937](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__LL7M/blob/main/results_2023-10-10T15-26-54.562937.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.34825966475221526,\n\
\ \"acc_stderr\": 0.03432792752430016,\n \"acc_norm\": 0.35202521117414026,\n\
\ \"acc_norm_stderr\": 0.03432431317156477,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.41389661402155314,\n\
\ \"mc2_stderr\": 0.014667249870313126\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4121160409556314,\n \"acc_stderr\": 0.014383915302225393,\n\
\ \"acc_norm\": 0.4496587030716723,\n \"acc_norm_stderr\": 0.014537144444284738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5034853614817766,\n\
\ \"acc_stderr\": 0.004989660180792182,\n \"acc_norm\": 0.6881099382593109,\n\
\ \"acc_norm_stderr\": 0.004623184227344774\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.35094339622641507,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.35094339622641507,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04082482904638629,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.022418042891113946,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.022418042891113946\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.040061680838488774,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.040061680838488774\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.3096774193548387,\n \"acc_stderr\": 0.026302774983517418,\n\
\ \"acc_norm\": 0.3096774193548387,\n \"acc_norm_stderr\": 0.026302774983517418\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.22660098522167488,\n \"acc_stderr\": 0.02945486383529297,\n \"\
acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.02945486383529297\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.37575757575757573,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.37575757575757573,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3282828282828283,\n \"acc_stderr\": 0.03345678422756779,\n \"\
acc_norm\": 0.3282828282828283,\n \"acc_norm_stderr\": 0.03345678422756779\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41968911917098445,\n \"acc_stderr\": 0.035615873276858834,\n\
\ \"acc_norm\": 0.41968911917098445,\n \"acc_norm_stderr\": 0.035615873276858834\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02934457250063434,\n \
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02934457250063434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"\
acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4073394495412844,\n \"acc_stderr\": 0.021065986244412877,\n \"\
acc_norm\": 0.4073394495412844,\n \"acc_norm_stderr\": 0.021065986244412877\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012414,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012414\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29901960784313725,\n \"acc_stderr\": 0.03213325717373616,\n \"\
acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.03213325717373616\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4936708860759494,\n \"acc_stderr\": 0.03254462010767859,\n \
\ \"acc_norm\": 0.4936708860759494,\n \"acc_norm_stderr\": 0.03254462010767859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4563106796116505,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.4563106796116505,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.405982905982906,\n\
\ \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.405982905982906,\n\
\ \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.454661558109834,\n\
\ \"acc_stderr\": 0.017806304585052602,\n \"acc_norm\": 0.454661558109834,\n\
\ \"acc_norm_stderr\": 0.017806304585052602\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.36127167630057805,\n \"acc_stderr\": 0.025862201852277875,\n\
\ \"acc_norm\": 0.36127167630057805,\n \"acc_norm_stderr\": 0.025862201852277875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n\
\ \"acc_stderr\": 0.01417304409830368,\n \"acc_norm\": 0.2346368715083799,\n\
\ \"acc_norm_stderr\": 0.01417304409830368\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3440514469453376,\n\
\ \"acc_stderr\": 0.026981478043648036,\n \"acc_norm\": 0.3440514469453376,\n\
\ \"acc_norm_stderr\": 0.026981478043648036\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3734567901234568,\n \"acc_stderr\": 0.026915003011380157,\n\
\ \"acc_norm\": 0.3734567901234568,\n \"acc_norm_stderr\": 0.026915003011380157\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2926988265971317,\n\
\ \"acc_stderr\": 0.01162094919584953,\n \"acc_norm\": 0.2926988265971317,\n\
\ \"acc_norm_stderr\": 0.01162094919584953\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.02952009569768777,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.02952009569768777\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.31699346405228757,\n \"acc_stderr\": 0.018824219512706204,\n \
\ \"acc_norm\": 0.31699346405228757,\n \"acc_norm_stderr\": 0.018824219512706204\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03168091161233882,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03168091161233882\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673282,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4269005847953216,\n \"acc_stderr\": 0.03793620616529916,\n\
\ \"acc_norm\": 0.4269005847953216,\n \"acc_norm_stderr\": 0.03793620616529916\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.41389661402155314,\n\
\ \"mc2_stderr\": 0.014667249870313126\n }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/LL7M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-26-54.562937.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-26-54.562937.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-26-54.562937.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_26_54.562937
path:
- results_2023-10-10T15-26-54.562937.parquet
- split: latest
path:
- results_2023-10-10T15-26-54.562937.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/LL7M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/LL7M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/LL7M](https://huggingface.co/JosephusCheung/LL7M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__LL7M",
"harness_truthfulqa_mc_0",
split="train")
```
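To pin a particular task instead, pass one of the configurations listed in this card's metadata together with the `latest` split. This is a minimal sketch; the config and split names are taken from the `configs` section above:
```python
from datasets import load_dataset

# Per-sample details for the 25-shot ARC challenge evaluation of this model.
# The "latest" split always resolves to the most recent timestamped run.
data = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__LL7M",
    "harness_arc_challenge_25",
    split="latest",
)
```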
## Latest results
These are the [latest results from run 2023-10-10T15:26:54.562937](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__LL7M/blob/main/results_2023-10-10T15-26-54.562937.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.34825966475221526,
"acc_stderr": 0.03432792752430016,
"acc_norm": 0.35202521117414026,
"acc_norm_stderr": 0.03432431317156477,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.41389661402155314,
"mc2_stderr": 0.014667249870313126
},
"harness|arc:challenge|25": {
"acc": 0.4121160409556314,
"acc_stderr": 0.014383915302225393,
"acc_norm": 0.4496587030716723,
"acc_norm_stderr": 0.014537144444284738
},
"harness|hellaswag|10": {
"acc": 0.5034853614817766,
"acc_stderr": 0.004989660180792182,
"acc_norm": 0.6881099382593109,
"acc_norm_stderr": 0.004623184227344774
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.35094339622641507,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.35094339622641507,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4,
"acc_stderr": 0.04082482904638629,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04082482904638629
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113946,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113946
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.02945486383529297,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.02945486383529297
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.37575757575757573,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.37575757575757573,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3282828282828283,
"acc_stderr": 0.03345678422756779,
"acc_norm": 0.3282828282828283,
"acc_norm_stderr": 0.03345678422756779
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41968911917098445,
"acc_stderr": 0.035615873276858834,
"acc_norm": 0.41968911917098445,
"acc_norm_stderr": 0.035615873276858834
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02934457250063434,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02934457250063434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4073394495412844,
"acc_stderr": 0.021065986244412877,
"acc_norm": 0.4073394495412844,
"acc_norm_stderr": 0.021065986244412877
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012414,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012414
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.03213325717373616,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.03213325717373616
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4936708860759494,
"acc_stderr": 0.03254462010767859,
"acc_norm": 0.4936708860759494,
"acc_norm_stderr": 0.03254462010767859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.4563106796116505,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.4563106796116505,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.405982905982906,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.405982905982906,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.454661558109834,
"acc_stderr": 0.017806304585052602,
"acc_norm": 0.454661558109834,
"acc_norm_stderr": 0.017806304585052602
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.36127167630057805,
"acc_stderr": 0.025862201852277875,
"acc_norm": 0.36127167630057805,
"acc_norm_stderr": 0.025862201852277875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.01417304409830368,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.01417304409830368
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3440514469453376,
"acc_stderr": 0.026981478043648036,
"acc_norm": 0.3440514469453376,
"acc_norm_stderr": 0.026981478043648036
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3734567901234568,
"acc_stderr": 0.026915003011380157,
"acc_norm": 0.3734567901234568,
"acc_norm_stderr": 0.026915003011380157
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2926988265971317,
"acc_stderr": 0.01162094919584953,
"acc_norm": 0.2926988265971317,
"acc_norm_stderr": 0.01162094919584953
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.02952009569768777,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.02952009569768777
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.31699346405228757,
"acc_stderr": 0.018824219512706204,
"acc_norm": 0.31699346405228757,
"acc_norm_stderr": 0.018824219512706204
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03168091161233882,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03168091161233882
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673282,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673282
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4269005847953216,
"acc_stderr": 0.03793620616529916,
"acc_norm": 0.4269005847953216,
"acc_norm_stderr": 0.03793620616529916
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.41389661402155314,
"mc2_stderr": 0.014667249870313126
}
}
```
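The aggregate scores above are also stored in the `results` configuration declared in this card's metadata. A minimal sketch for loading them as a dataset rather than reading the JSON file directly:
```python
from datasets import load_dataset

# One row of aggregated metrics per evaluation run; for this card the
# "latest" split points to results_2023-10-10T15-26-54.562937.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_JosephusCheung__LL7M",
    "results",
    split="latest",
)
```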
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
result-muse256-muse512-wuerst-sdv15/11cb7618 | 2023-10-10T15:28:26.000Z | [
"region:us"
] | result-muse256-muse512-wuerst-sdv15 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 219
num_examples: 10
download_size: 1429
dataset_size: 219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "11cb7618"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
text2font/words_with_path_tags_version_2_train | 2023-10-10T20:31:19.000Z | [
"region:us"
] | text2font | null | null | null | 0 | 0 | Entry not found |