id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
AnthonyRayo/AutomAssistPlugin | 2023-08-25T14:50:59.000Z | [
"region:us"
] | AnthonyRayo | null | null | null | 0 | 0 | Entry not found |
yardeny/tokenized_bert_context_len_128 | 2023-08-25T15:13:16.000Z | [
"region:us"
] | yardeny | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 12813417444
num_examples: 80462898
download_size: 4328077891
dataset_size: 12813417444
---
# Dataset Card for "tokenized_bert_context_len_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yardeny/tokenized_bert_context_len_64 | 2023-08-25T15:15:59.000Z | [
"region:us"
] | yardeny | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 10826495952
num_examples: 80462898
download_size: 3641089943
dataset_size: 10826495952
---
# Dataset Card for "tokenized_bert_context_len_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AndresR2909/finetuning_dataset_lamini | 2023-08-25T15:08:03.000Z | [
"region:us"
] | AndresR2909 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2150284.5
num_examples: 1260
- name: test
num_bytes: 238920.5
num_examples: 140
download_size: 698665
dataset_size: 2389205.0
---
# Dataset Card for "finetuning_dataset_lamini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AIHowto/kohyasd15andsdxlCharConfig | 2023-08-25T15:16:06.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | AIHowto | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
yardeny/processed_bert_context_len_64 | 2023-08-25T15:35:39.000Z | [
"region:us"
] | yardeny | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 10153404360.0
num_examples: 25639910
download_size: 3558342060
dataset_size: 10153404360.0
---
# Dataset Card for "processed_bert_context_len_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_heegyu__WizardVicuna-open-llama-3b-v2 | 2023-08-27T12:42:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of heegyu/WizardVicuna-open-llama-3b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/WizardVicuna-open-llama-3b-v2](https://huggingface.co/heegyu/WizardVicuna-open-llama-3b-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__WizardVicuna-open-llama-3b-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T15:20:35.003113](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-open-llama-3b-v2/blob/main/results_2023-08-25T15%3A20%3A35.003113.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27752117619785666,\n\
\ \"acc_stderr\": 0.032433048841838986,\n \"acc_norm\": 0.2807476521867636,\n\
\ \"acc_norm_stderr\": 0.03243195971215834,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062135,\n \"mc2\": 0.36797618702265905,\n\
\ \"mc2_stderr\": 0.014084676404213643\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3506825938566553,\n \"acc_stderr\": 0.01394463593072609,\n\
\ \"acc_norm\": 0.3771331058020478,\n \"acc_norm_stderr\": 0.014163366896192593\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5020912168890659,\n\
\ \"acc_stderr\": 0.004989737768749949,\n \"acc_norm\": 0.6660027882891855,\n\
\ \"acc_norm_stderr\": 0.004706748152125317\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137282,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137282\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501708,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501708\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.03396116205845334,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.03396116205845334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.031862098516411454,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.031862098516411454\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838735,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838735\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378948,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378948\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1967741935483871,\n \"acc_stderr\": 0.022616409420742018,\n \"\
acc_norm\": 0.1967741935483871,\n \"acc_norm_stderr\": 0.022616409420742018\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.28078817733990147,\n \"acc_stderr\": 0.031618563353586086,\n \"\
acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.031618563353586086\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.02951928261681725,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.02951928261681725\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2743589743589744,\n \"acc_stderr\": 0.022622765767493214,\n\
\ \"acc_norm\": 0.2743589743589744,\n \"acc_norm_stderr\": 0.022622765767493214\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361255,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361255\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22201834862385322,\n \"acc_stderr\": 0.01781884956479662,\n \"\
acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.01781884956479662\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.19117647058823528,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292535,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292535\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891165,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891165\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3063583815028902,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.3063583815028902,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2973856209150327,\n \"acc_stderr\": 0.026173908506718576,\n\
\ \"acc_norm\": 0.2973856209150327,\n \"acc_norm_stderr\": 0.026173908506718576\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503796,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503796\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\
\ \"acc_stderr\": 0.01106415102716544,\n \"acc_norm\": 0.2503259452411995,\n\
\ \"acc_norm_stderr\": 0.01106415102716544\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815198,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.03141470802586589,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.03141470802586589\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062135,\n \"mc2\": 0.36797618702265905,\n\
\ \"mc2_stderr\": 0.014084676404213643\n }\n}\n```"
repo_url: https://huggingface.co/heegyu/WizardVicuna-open-llama-3b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|arc:challenge|25_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hellaswag|10_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T15:20:35.003113.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T15:20:35.003113.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T15:20:35.003113.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T15:20:35.003113.parquet'
- config_name: results
data_files:
- split: 2023_08_25T15_20_35.003113
path:
- results_2023-08-25T15:20:35.003113.parquet
- split: latest
path:
- results_2023-08-25T15:20:35.003113.parquet
---
# Dataset Card for Evaluation run of heegyu/WizardVicuna-open-llama-3b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/WizardVicuna-open-llama-3b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna-open-llama-3b-v2](https://huggingface.co/heegyu/WizardVicuna-open-llama-3b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__WizardVicuna-open-llama-3b-v2",
"harness_truthfulqa_mc_0",
	split="latest")
```
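As the YAML above suggests, each configuration name is derived mechanically from the harness task key and the few-shot count, with `-` and `:` mapped to `_`. A minimal sketch of that convention (the helper name `config_name` is hypothetical, not part of the `datasets` API):

```python
def config_name(task: str, num_fewshot: int) -> str:
    # Hypothetical helper mirroring the naming visible in the YAML above,
    # e.g. ("hendrycksTest-virology", 5) -> "harness_hendrycksTest_virology_5"
    return f"harness_{task}_{num_fewshot}".replace("-", "_").replace(":", "_")

print(config_name("hendrycksTest-virology", 5))  # harness_hendrycksTest_virology_5
print(config_name("truthfulqa:mc", 0))           # harness_truthfulqa_mc_0
```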
## Latest results
These are the [latest results from run 2023-08-25T15:20:35.003113](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-open-llama-3b-v2/blob/main/results_2023-08-25T15%3A20%3A35.003113.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.27752117619785666,
"acc_stderr": 0.032433048841838986,
"acc_norm": 0.2807476521867636,
"acc_norm_stderr": 0.03243195971215834,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062135,
"mc2": 0.36797618702265905,
"mc2_stderr": 0.014084676404213643
},
"harness|arc:challenge|25": {
"acc": 0.3506825938566553,
"acc_stderr": 0.01394463593072609,
"acc_norm": 0.3771331058020478,
"acc_norm_stderr": 0.014163366896192593
},
"harness|hellaswag|10": {
"acc": 0.5020912168890659,
"acc_stderr": 0.004989737768749949,
"acc_norm": 0.6660027882891855,
"acc_norm_stderr": 0.004706748152125317
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.026616482980501708,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.026616482980501708
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.03396116205845334,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.03396116205845334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.031862098516411454,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.031862098516411454
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838735,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378948,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378948
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1967741935483871,
"acc_stderr": 0.022616409420742018,
"acc_norm": 0.1967741935483871,
"acc_norm_stderr": 0.022616409420742018
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.031618563353586086,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.031618563353586086
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.02951928261681725,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.02951928261681725
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2743589743589744,
"acc_stderr": 0.022622765767493214,
"acc_norm": 0.2743589743589744,
"acc_norm_stderr": 0.022622765767493214
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.01781884956479662,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.01781884956479662
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292535,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292535
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891165,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891165
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3063583815028902,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.3063583815028902,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2973856209150327,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.2973856209150327,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.02558306248998482,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.02558306248998482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503796,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.01106415102716544,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.01106415102716544
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815198,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062135,
"mc2": 0.36797618702265905,
"mc2_stderr": 0.014084676404213643
}
}
```
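Once parsed as a Python dict, the per-task entries above can be post-processed locally, for instance to rank subtasks by `acc_norm`. A small sketch (only a subset of the JSON is inlined here for illustration; the full dict has one entry per task):

```python
# Subset of the results JSON above, keyed by harness task name.
results = {
    "harness|hendrycksTest-high_school_statistics|5": {"acc_norm": 0.39351851851851855},
    "harness|hendrycksTest-security_studies|5": {"acc_norm": 0.40408163265306124},
    "harness|hendrycksTest-high_school_biology|5": {"acc_norm": 0.1967741935483871},
    "harness|hendrycksTest-high_school_us_history|5": {"acc_norm": 0.19117647058823528},
}

def task_name(key: str) -> str:
    # "harness|hendrycksTest-virology|5" -> "virology"
    return key.split("|")[1].removeprefix("hendrycksTest-")

best = max(results, key=lambda k: results[k]["acc_norm"])
worst = min(results, key=lambda k: results[k]["acc_norm"])
print(task_name(best))   # security_studies
print(task_name(worst))  # high_school_us_history
```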
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
abiyo27/BibleTTS_Ewe-Bible | 2023-08-27T21:23:10.000Z | [
"license:cc-by-sa-4.0",
"region:us"
] | abiyo27 | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
---
|
OUX/temporal | 2023-08-25T15:29:28.000Z | [
"license:mit",
"region:us"
] | OUX | null | null | null | 0 | 0 | ---
license: mit
---
|
OUX/temporal_split | 2023-08-25T16:07:16.000Z | [
"license:apache-2.0",
"region:us"
] | OUX | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Kutches/Danganronpa | 2023-08-25T17:06:13.000Z | [
"license:openrail",
"region:us"
] | Kutches | null | null | null | 0 | 0 | ---
license: openrail
---
|
Isaak-Carter/Wizzard-smol | 2023-08-25T16:12:13.000Z | [
"license:bigscience-openrail-m",
"region:us"
] | Isaak-Carter | null | null | null | 0 | 0 | ---
license: bigscience-openrail-m
---
|
X-CONG/CONG.v.01.2 | 2023-08-25T17:02:13.000Z | [
"region:us"
] | X-CONG | null | null | null | 0 | 0 | Entry not found |
Kiwihead15/github-issues | 2023-08-25T16:38:27.000Z | [
"region:us"
] | Kiwihead15 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 25627570
num_examples: 4500
download_size: 7330125
dataset_size: 25627570
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_codellama__CodeLlama-13b-Python-hf | 2023-09-23T16:36:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-13b-Python-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-13b-Python-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T16:36:19.562140](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-Python-hf/blob/main/results_2023-09-23T16-36-19.562140.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931190423,\n \"f1\": 0.04866296140939616,\n\
\ \"f1_stderr\": 0.001201832323988023,\n \"acc\": 0.36839214132827663,\n\
\ \"acc_stderr\": 0.010571059008977151\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190423,\n\
\ \"f1\": 0.04866296140939616,\n \"f1_stderr\": 0.001201832323988023\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.007740044337103802\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6503551696921863,\n \"acc_stderr\": 0.0134020736808505\n\
\ }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-13b-Python-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|arc:challenge|25_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T16_36_19.562140
path:
- '**/details_harness|drop|3_2023-09-23T16-36-19.562140.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T16-36-19.562140.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T16_36_19.562140
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-36-19.562140.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T16-36-19.562140.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hellaswag|10_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T16:41:17.923081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:23:55.023532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T16:41:17.923081.parquet'
- split: 2023_08_26T05_23_55.023532
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:23:55.023532.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:23:55.023532.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T16_36_19.562140
path:
- '**/details_harness|winogrande|5_2023-09-23T16-36-19.562140.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T16-36-19.562140.parquet'
- config_name: results
data_files:
- split: 2023_08_25T16_41_17.923081
path:
- results_2023-08-25T16:41:17.923081.parquet
- split: 2023_08_26T05_23_55.023532
path:
- results_2023-08-26T05:23:55.023532.parquet
- split: 2023_09_23T16_36_19.562140
path:
- results_2023-09-23T16-36-19.562140.parquet
- split: latest
path:
- results_2023-09-23T16-36-19.562140.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-13b-Python-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-13b-Python-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
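The run splits are named with a sanitized timestamp (e.g. `2023_09_23T16_36_19.562140`, where underscores stand in for the `-` and `:` of the ISO form). As a purely illustrative sketch (not part of the dataset tooling), a small helper can recover the original datetime from a split name:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_09_23T16_36_19.562140":
    # underscores replace "-" in the date and ":" in the time.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2023_09_23T16_36_19.562140"))
# 2023-09-23 16:36:19.562140
```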
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-13b-Python-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T16:36:19.562140](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-Python-hf/blob/main/results_2023-09-23T16-36-19.562140.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190423,
"f1": 0.04866296140939616,
"f1_stderr": 0.001201832323988023,
"acc": 0.36839214132827663,
"acc_stderr": 0.010571059008977151
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190423,
"f1": 0.04866296140939616,
"f1_stderr": 0.001201832323988023
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.007740044337103802
},
"harness|winogrande|5": {
"acc": 0.6503551696921863,
"acc_stderr": 0.0134020736808505
}
}
```
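As a quick sanity check on the aggregate above, the `acc` under `all` appears to be the unweighted mean of the per-task accuracies (gsm8k and winogrande), while `em` and `f1` are carried over directly from the single drop task. A hedged illustration of that relationship:

```python
# Per-task accuracies copied from the results JSON above
gsm8k_acc = 0.08642911296436695
winogrande_acc = 0.6503551696921863

# The "all" accuracy appears to be their unweighted mean
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ≈ 0.36839214132827663
```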
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf | 2023-08-27T12:42:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-7b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T03:58:42.829453](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf/blob/main/results_2023-08-26T03%3A58%3A42.829453.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the \"results\" configuration and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36604873109894914,\n\
\ \"acc_stderr\": 0.03478996450781667,\n \"acc_norm\": 0.36902203313034215,\n\
\ \"acc_norm_stderr\": 0.03479254213817198,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522509,\n \"mc2\": 0.41449406477832335,\n\
\ \"mc2_stderr\": 0.014682516036163287\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35665529010238906,\n \"acc_stderr\": 0.013998056902620199,\n\
\ \"acc_norm\": 0.38310580204778155,\n \"acc_norm_stderr\": 0.014206472661672881\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4442342162915754,\n\
\ \"acc_stderr\": 0.004958649623815333,\n \"acc_norm\": 0.5932085241983669,\n\
\ \"acc_norm_stderr\": 0.004902314055725614\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3622641509433962,\n \"acc_stderr\": 0.0295822451283843,\n\
\ \"acc_norm\": 0.3622641509433962,\n \"acc_norm_stderr\": 0.0295822451283843\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.38064516129032255,\n \"acc_stderr\": 0.02762171783290704,\n \"\
acc_norm\": 0.38064516129032255,\n \"acc_norm_stderr\": 0.02762171783290704\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"\
acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.47474747474747475,\n \"acc_stderr\": 0.035578062450873145,\n \"\
acc_norm\": 0.47474747474747475,\n \"acc_norm_stderr\": 0.035578062450873145\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.03555300319557673,\n\
\ \"acc_norm\": 0.41450777202072536,\n \"acc_norm_stderr\": 0.03555300319557673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.02361088430892786,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.031282177063684614,\n\
\ \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.41100917431192663,\n \"acc_stderr\": 0.021095050687277638,\n \"\
acc_norm\": 0.41100917431192663,\n \"acc_norm_stderr\": 0.021095050687277638\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560538,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560538\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3431372549019608,\n \"acc_stderr\": 0.033321399446680854,\n \"\
acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.033321399446680854\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3459915611814346,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.3459915611814346,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4349775784753363,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.4349775784753363,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4462809917355372,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.4462809917355372,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.038470214204560246,\n\
\ \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.038470214204560246\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.46601941747572817,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.46601941747572817,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641087,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641087\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.45849297573435505,\n\
\ \"acc_stderr\": 0.017818248603465568,\n \"acc_norm\": 0.45849297573435505,\n\
\ \"acc_norm_stderr\": 0.017818248603465568\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.36127167630057805,\n \"acc_stderr\": 0.025862201852277868,\n\
\ \"acc_norm\": 0.36127167630057805,\n \"acc_norm_stderr\": 0.025862201852277868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925286,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925286\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.369281045751634,\n \"acc_stderr\": 0.027634176689602656,\n\
\ \"acc_norm\": 0.369281045751634,\n \"acc_norm_stderr\": 0.027634176689602656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.37942122186495175,\n\
\ \"acc_stderr\": 0.027559949802347824,\n \"acc_norm\": 0.37942122186495175,\n\
\ \"acc_norm_stderr\": 0.027559949802347824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.41358024691358025,\n \"acc_stderr\": 0.027402042040269966,\n\
\ \"acc_norm\": 0.41358024691358025,\n \"acc_norm_stderr\": 0.027402042040269966\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590624,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590624\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28096479791395046,\n\
\ \"acc_stderr\": 0.011479684550077675,\n \"acc_norm\": 0.28096479791395046,\n\
\ \"acc_norm_stderr\": 0.011479684550077675\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3860294117647059,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.3860294117647059,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3022875816993464,\n \"acc_stderr\": 0.01857923271111388,\n \
\ \"acc_norm\": 0.3022875816993464,\n \"acc_norm_stderr\": 0.01857923271111388\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n\
\ \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n\
\ \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.03119223072679566,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.03119223072679566\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4527363184079602,\n\
\ \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.4527363184079602,\n\
\ \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.038342347441649924,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.038342347441649924\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522509,\n \"mc2\": 0.41449406477832335,\n\
\ \"mc2_stderr\": 0.014682516036163287\n }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|arc:challenge|25_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|arc:challenge|25_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hellaswag|10_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hellaswag|10_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:04:00.078187.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T03:58:42.829453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T17:04:00.078187.parquet'
- split: 2023_08_26T03_58_42.829453
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T03:58:42.829453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T03:58:42.829453.parquet'
- config_name: results
data_files:
- split: 2023_08_25T17_04_00.078187
path:
- results_2023-08-25T17:04:00.078187.parquet
- split: 2023_08_26T03_58_42.829453
path:
- results_2023-08-26T03:58:42.829453.parquet
- split: latest
path:
- results_2023-08-26T03:58:42.829453.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-7b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
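As a side note, the timestamp-named splits listed in the configuration section can be parsed back into Python datetimes; the split name below is one of the runs recorded in this dataset, and the format string is an assumption based on that naming pattern (`:` and `-` replaced by `_` so the name is a valid split identifier):

```python
from datetime import datetime

# Split names encode the run timestamp, with ":" and "-" replaced by "_"
# so they are valid split identifiers.
split_name = "2023_08_26T03_58_42.829453"
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # 2023-08-26T03:58:42.829453
```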
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T03:58:42.829453](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-Instruct-hf/blob/main/results_2023-08-26T03%3A58%3A42.829453.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.36604873109894914,
"acc_stderr": 0.03478996450781667,
"acc_norm": 0.36902203313034215,
"acc_norm_stderr": 0.03479254213817198,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522509,
"mc2": 0.41449406477832335,
"mc2_stderr": 0.014682516036163287
},
"harness|arc:challenge|25": {
"acc": 0.35665529010238906,
"acc_stderr": 0.013998056902620199,
"acc_norm": 0.38310580204778155,
"acc_norm_stderr": 0.014206472661672881
},
"harness|hellaswag|10": {
"acc": 0.4442342162915754,
"acc_stderr": 0.004958649623815333,
"acc_norm": 0.5932085241983669,
"acc_norm_stderr": 0.004902314055725614
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3622641509433962,
"acc_stderr": 0.0295822451283843,
"acc_norm": 0.3622641509433962,
"acc_norm_stderr": 0.0295822451283843
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.4,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38064516129032255,
"acc_stderr": 0.02762171783290704,
"acc_norm": 0.38064516129032255,
"acc_norm_stderr": 0.02762171783290704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.47474747474747475,
"acc_stderr": 0.035578062450873145,
"acc_norm": 0.47474747474747475,
"acc_norm_stderr": 0.035578062450873145
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41450777202072536,
"acc_stderr": 0.03555300319557673,
"acc_norm": 0.41450777202072536,
"acc_norm_stderr": 0.03555300319557673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.41100917431192663,
"acc_stderr": 0.021095050687277638,
"acc_norm": 0.41100917431192663,
"acc_norm_stderr": 0.021095050687277638
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560538,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560538
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.033321399446680854,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.033321399446680854
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3459915611814346,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.3459915611814346,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4349775784753363,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.4349775784753363,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4462809917355372,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.4462809917355372,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.038470214204560246,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.038470214204560246
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.46601941747572817,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.46601941747572817,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641087,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641087
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.45849297573435505,
"acc_stderr": 0.017818248603465568,
"acc_norm": 0.45849297573435505,
"acc_norm_stderr": 0.017818248603465568
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.36127167630057805,
"acc_stderr": 0.025862201852277868,
"acc_norm": 0.36127167630057805,
"acc_norm_stderr": 0.025862201852277868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925286,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.369281045751634,
"acc_stderr": 0.027634176689602656,
"acc_norm": 0.369281045751634,
"acc_norm_stderr": 0.027634176689602656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.37942122186495175,
"acc_stderr": 0.027559949802347824,
"acc_norm": 0.37942122186495175,
"acc_norm_stderr": 0.027559949802347824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.41358024691358025,
"acc_stderr": 0.027402042040269966,
"acc_norm": 0.41358024691358025,
"acc_norm_stderr": 0.027402042040269966
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590624,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590624
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28096479791395046,
"acc_stderr": 0.011479684550077675,
"acc_norm": 0.28096479791395046,
"acc_norm_stderr": 0.011479684550077675
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3860294117647059,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.3860294117647059,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3022875816993464,
"acc_stderr": 0.01857923271111388,
"acc_norm": 0.3022875816993464,
"acc_norm_stderr": 0.01857923271111388
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.03119223072679566,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.03119223072679566
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4527363184079602,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.4527363184079602,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.038342347441649924,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.038342347441649924
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522509,
"mc2": 0.41449406477832335,
"mc2_stderr": 0.014682516036163287
}
}
```
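For a quick sanity check, the per-task scores above can be aggregated locally; the snippet below is a sketch that averages the `acc` of three `hendrycksTest-*` entries copied from the JSON (the full results dict has one entry per task, so a real aggregation would iterate over all of them):

```python
# Aggregate per-task accuracies from the leaderboard results JSON.
# The keys and values below are copied from the "Latest results" block above
# (only three tasks shown for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.37037037037037035},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3026315789473684},
}

# Mean accuracy over the MMLU (hendrycksTest) tasks present in the dict.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU mean over {len(mmlu_tasks)} tasks: {mean_acc:.4f}")  # 0.3343
```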
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nampromotion/Llama-2-RH | 2023-08-25T17:19:57.000Z | [
"region:us"
] | Nampromotion | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf | 2023-08-27T12:42:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-13b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T17:15:30.693025](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf/blob/main/results_2023-08-25T17%3A15%3A30.693025.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3907712317306086,\n\
\ \"acc_stderr\": 0.03516235193628555,\n \"acc_norm\": 0.39424213723863355,\n\
\ \"acc_norm_stderr\": 0.035161236947640194,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.45878663529563757,\n\
\ \"mc2_stderr\": 0.014860043549181953\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4087030716723549,\n \"acc_stderr\": 0.014365750345427006,\n\
\ \"acc_norm\": 0.4453924914675768,\n \"acc_norm_stderr\": 0.014523987638344085\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4811790479984067,\n\
\ \"acc_stderr\": 0.004986245115428458,\n \"acc_norm\": 0.6492730531766581,\n\
\ \"acc_norm_stderr\": 0.004762223492435257\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.38113207547169814,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.38113207547169814,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.031410821975962414,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.031410821975962414\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906866,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906866\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4064516129032258,\n \"acc_stderr\": 0.0279417273462563,\n \"acc_norm\"\
: 0.4064516129032258,\n \"acc_norm_stderr\": 0.0279417273462563\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n\
\ \"acc_stderr\": 0.03240661565868408,\n \"acc_norm\": 0.3054187192118227,\n\
\ \"acc_norm_stderr\": 0.03240661565868408\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.03825460278380026,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n \"harness|hendrycksTest-high_school_geography|5\"\
: {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n\
\ \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5077720207253886,\n \"acc_stderr\": 0.03608003225569654,\n\
\ \"acc_norm\": 0.5077720207253886,\n \"acc_norm_stderr\": 0.03608003225569654\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.02446861524147892,\n\
\ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.02446861524147892\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097863,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097863\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.03149930577784906,\n\
\ \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.03149930577784906\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4990825688073395,\n \"acc_stderr\": 0.021437287056051215,\n \"\
acc_norm\": 0.4990825688073395,\n \"acc_norm_stderr\": 0.021437287056051215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39705882352941174,\n \"acc_stderr\": 0.034341311647191286,\n \"\
acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.034341311647191286\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.38396624472573837,\n \"acc_stderr\": 0.031658678064106674,\n \
\ \"acc_norm\": 0.38396624472573837,\n \"acc_norm_stderr\": 0.031658678064106674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.038566721635489125,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.038566721635489125\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4563106796116505,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.4563106796116505,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.02987257770889117,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.02987257770889117\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562427,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562427\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4955300127713921,\n\
\ \"acc_stderr\": 0.017879248970584377,\n \"acc_norm\": 0.4955300127713921,\n\
\ \"acc_norm_stderr\": 0.017879248970584377\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3468208092485549,\n \"acc_stderr\": 0.025624723994030457,\n\
\ \"acc_norm\": 0.3468208092485549,\n \"acc_norm_stderr\": 0.025624723994030457\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n\
\ \"acc_stderr\": 0.027917050748484634,\n \"acc_norm\": 0.40836012861736337,\n\
\ \"acc_norm_stderr\": 0.027917050748484634\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02712511551316686,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02712511551316686\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2985658409387223,\n\
\ \"acc_stderr\": 0.011688060141794208,\n \"acc_norm\": 0.2985658409387223,\n\
\ \"acc_norm_stderr\": 0.011688060141794208\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.028888193103988647,\n\
\ \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.028888193103988647\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3284313725490196,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.43673469387755104,\n \"acc_stderr\": 0.03175195237583323,\n\
\ \"acc_norm\": 0.43673469387755104,\n \"acc_norm_stderr\": 0.03175195237583323\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4577114427860697,\n\
\ \"acc_stderr\": 0.035228658640995975,\n \"acc_norm\": 0.4577114427860697,\n\
\ \"acc_norm_stderr\": 0.035228658640995975\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4678362573099415,\n \"acc_stderr\": 0.038268824176603676,\n\
\ \"acc_norm\": 0.4678362573099415,\n \"acc_norm_stderr\": 0.038268824176603676\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.45878663529563757,\n\
\ \"mc2_stderr\": 0.014860043549181953\n }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|arc:challenge|25_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hellaswag|10_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T17:15:30.693025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T17:15:30.693025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T17:15:30.693025.parquet'
- config_name: results
data_files:
- split: 2023_08_25T17_15_30.693025
path:
- results_2023-08-25T17:15:30.693025.parquet
- split: latest
path:
- results_2023-08-25T17:15:30.693025.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-13b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf",
"harness_truthfulqa_mc_0",
split="train")
```
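The per-run split names (e.g. `2023_08_25T17_15_30.693025`) are zero-padded timestamps, so they sort lexicographically in chronological order. A minimal sketch of picking the most recent run without relying on the `latest` alias (the earlier timestamp in the list is an illustrative placeholder, not a real run of this model):

```python
# Split names like "2023_08_25T17_15_30.693025" are zero-padded timestamps,
# so lexicographic order matches chronological order.
splits = ["2023_08_24T10_00_00.000000", "2023_08_25T17_15_30.693025", "latest"]

# Drop the "latest" alias and sort the remaining timestamped splits.
timestamped = sorted(s for s in splits if s != "latest")
latest_run = timestamped[-1]  # the most recent run
```

`latest_run` can then be passed as the `split` argument in place of `"latest"`.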
## Latest results
These are the [latest results from run 2023-08-25T17:15:30.693025](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-Instruct-hf/blob/main/results_2023-08-25T17%3A15%3A30.693025.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3907712317306086,
"acc_stderr": 0.03516235193628555,
"acc_norm": 0.39424213723863355,
"acc_norm_stderr": 0.035161236947640194,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.45878663529563757,
"mc2_stderr": 0.014860043549181953
},
"harness|arc:challenge|25": {
"acc": 0.4087030716723549,
"acc_stderr": 0.014365750345427006,
"acc_norm": 0.4453924914675768,
"acc_norm_stderr": 0.014523987638344085
},
"harness|hellaswag|10": {
"acc": 0.4811790479984067,
"acc_stderr": 0.004986245115428458,
"acc_norm": 0.6492730531766581,
"acc_norm_stderr": 0.004762223492435257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.38113207547169814,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.38113207547169814,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.031410821975962414,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.031410821975962414
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906866,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906866
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4064516129032258,
"acc_stderr": 0.0279417273462563,
"acc_norm": 0.4064516129032258,
"acc_norm_stderr": 0.0279417273462563
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5077720207253886,
"acc_stderr": 0.03608003225569654,
"acc_norm": 0.5077720207253886,
"acc_norm_stderr": 0.03608003225569654
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097863,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097863
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4990825688073395,
"acc_stderr": 0.021437287056051215,
"acc_norm": 0.4990825688073395,
"acc_norm_stderr": 0.021437287056051215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.034341311647191286,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.034341311647191286
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.38396624472573837,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.38396624472573837,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.038566721635489125,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.038566721635489125
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.4563106796116505,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.4563106796116505,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889117,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4955300127713921,
"acc_stderr": 0.017879248970584377,
"acc_norm": 0.4955300127713921,
"acc_norm_stderr": 0.017879248970584377
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.025624723994030457,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.025624723994030457
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.027917050748484634,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.027917050748484634
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02712511551316686,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02712511551316686
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2985658409387223,
"acc_stderr": 0.011688060141794208,
"acc_norm": 0.2985658409387223,
"acc_norm_stderr": 0.011688060141794208
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.028888193103988647,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.028888193103988647
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3284313725490196,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.3284313725490196,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.43673469387755104,
"acc_stderr": 0.03175195237583323,
"acc_norm": 0.43673469387755104,
"acc_norm_stderr": 0.03175195237583323
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4577114427860697,
"acc_stderr": 0.035228658640995975,
"acc_norm": 0.4577114427860697,
"acc_norm_stderr": 0.035228658640995975
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4678362573099415,
"acc_stderr": 0.038268824176603676,
"acc_norm": 0.4678362573099415,
"acc_norm_stderr": 0.038268824176603676
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.45878663529563757,
"mc2_stderr": 0.014860043549181953
}
}
```
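Since every per-task entry in the results dict above shares the same shape, per-benchmark aggregates can be recomputed directly from it. A minimal sketch, using two of the MMLU (hendrycksTest) values shown above:

```python
# Two sub-task entries copied from the results JSON above (truncated to "acc").
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.32592592592592595},
}

# Average accuracy over the MMLU (hendrycksTest) sub-tasks present.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
```

The same pattern works for `acc_norm` or any other metric key in the dict.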
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KasparZ/cyborg | 2023-09-02T16:39:12.000Z | [
"region:us"
] | KasparZ | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 818748
num_examples: 1442
download_size: 492101
dataset_size: 818748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cyborg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
exposetobacco/Amazon_Reviews | 2023-08-25T18:16:52.000Z | [
"region:us"
] | exposetobacco | null | null | null | 0 | 0 | The dataset is derived from Amazon Reviews with three sentiments 0: negative, 1:neutral and 2 positive.
The dataset is balanced and training to testing is 90:10%.
The dataset can be used for Three sentiments LLM fine Tuning. |
typevoid/german-company-addresses | 2023-08-26T10:28:58.000Z | [
"region:us"
] | typevoid | null | null | null | 1 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
marup/TakahashiAmatoSongRVC | 2023-08-25T18:24:44.000Z | [
"license:openrail",
"region:us"
] | marup | null | null | null | 0 | 0 | ---
license: openrail
---
|
ZiAngGu/omni3d_v5 | 2023-08-25T18:24:35.000Z | [
"region:us"
] | ZiAngGu | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Deci__DeciCoder-1b | 2023-08-27T12:42:37.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Deci/DeciCoder-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Deci/DeciCoder-1b](https://huggingface.co/Deci/DeciCoder-1b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Deci__DeciCoder-1b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T18:42:07.989702](https://huggingface.co/datasets/open-llm-leaderboard/details_Deci__DeciCoder-1b/blob/main/results_2023-08-25T18%3A42%3A07.989702.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24270247836868492,\n\
\ \"acc_stderr\": 0.03126714039709406,\n \"acc_norm\": 0.24404952511664843,\n\
\ \"acc_norm_stderr\": 0.03128980515853274,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.4705381335286149,\n\
\ \"mc2_stderr\": 0.015491012979962984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.16040955631399317,\n \"acc_stderr\": 0.010724336059110964,\n\
\ \"acc_norm\": 0.21160409556313994,\n \"acc_norm_stderr\": 0.011935916358632875\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2826130252937662,\n\
\ \"acc_stderr\": 0.004493495872000129,\n \"acc_norm\": 0.31089424417446726,\n\
\ \"acc_norm_stderr\": 0.004619136497359843\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066654,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066654\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610645,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610645\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.024618298195866518,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.024618298195866518\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.19,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.02176596167215453,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.02176596167215453\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.17419354838709677,\n\
\ \"acc_stderr\": 0.021576248184514583,\n \"acc_norm\": 0.17419354838709677,\n\
\ \"acc_norm_stderr\": 0.021576248184514583\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n\
\ \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.029252823291803617,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.029252823291803617\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.19487179487179487,\n \"acc_stderr\": 0.02008316759518139,\n\
\ \"acc_norm\": 0.19487179487179487,\n \"acc_norm_stderr\": 0.02008316759518139\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.02564947026588919,\n\
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.02564947026588919\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1908256880733945,\n \"acc_stderr\": 0.016847676400091112,\n \"\
acc_norm\": 0.1908256880733945,\n \"acc_norm_stderr\": 0.016847676400091112\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824688,\n \"\
acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824688\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693268,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693268\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.01605079214803655,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.01605079214803655\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.025583062489984827,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.025583062489984827\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.023891879541959614,\n\
\ \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.023891879541959614\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827066,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827066\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.17279411764705882,\n \"acc_stderr\": 0.022966067585581788,\n\
\ \"acc_norm\": 0.17279411764705882,\n \"acc_norm_stderr\": 0.022966067585581788\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2826797385620915,\n \"acc_stderr\": 0.018217269552053435,\n \
\ \"acc_norm\": 0.2826797385620915,\n \"acc_norm_stderr\": 0.018217269552053435\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.03384429155233135,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.03384429155233135\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.4705381335286149,\n\
\ \"mc2_stderr\": 0.015491012979962984\n }\n}\n```"
repo_url: https://huggingface.co/Deci/DeciCoder-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|arc:challenge|25_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hellaswag|10_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T18:42:07.989702.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T18:42:07.989702.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T18:42:07.989702.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T18:42:07.989702.parquet'
- config_name: results
data_files:
- split: 2023_08_25T18_42_07.989702
path:
- results_2023-08-25T18:42:07.989702.parquet
- split: latest
path:
- results_2023-08-25T18:42:07.989702.parquet
---
# Dataset Card for Evaluation run of Deci/DeciCoder-1b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Deci/DeciCoder-1b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Deci/DeciCoder-1b](https://huggingface.co/Deci/DeciCoder-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Deci__DeciCoder-1b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-25T18:42:07.989702](https://huggingface.co/datasets/open-llm-leaderboard/details_Deci__DeciCoder-1b/blob/main/results_2023-08-25T18%3A42%3A07.989702.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24270247836868492,
"acc_stderr": 0.03126714039709406,
"acc_norm": 0.24404952511664843,
"acc_norm_stderr": 0.03128980515853274,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476194,
"mc2": 0.4705381335286149,
"mc2_stderr": 0.015491012979962984
},
"harness|arc:challenge|25": {
"acc": 0.16040955631399317,
"acc_stderr": 0.010724336059110964,
"acc_norm": 0.21160409556313994,
"acc_norm_stderr": 0.011935916358632875
},
"harness|hellaswag|10": {
"acc": 0.2826130252937662,
"acc_stderr": 0.004493495872000129,
"acc_norm": 0.31089424417446726,
"acc_norm_stderr": 0.004619136497359843
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066654,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066654
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610645,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610645
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2,
"acc_stderr": 0.024618298195866518,
"acc_norm": 0.2,
"acc_norm_stderr": 0.024618298195866518
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.02176596167215453,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.02176596167215453
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.17419354838709677,
"acc_stderr": 0.021576248184514583,
"acc_norm": 0.17419354838709677,
"acc_norm_stderr": 0.021576248184514583
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.029252823291803617,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.029252823291803617
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.19487179487179487,
"acc_stderr": 0.02008316759518139,
"acc_norm": 0.19487179487179487,
"acc_norm_stderr": 0.02008316759518139
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.02564947026588919,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.02564947026588919
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1908256880733945,
"acc_stderr": 0.016847676400091112,
"acc_norm": 0.1908256880733945,
"acc_norm_stderr": 0.016847676400091112
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824688,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824688
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693268,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693268
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.01605079214803655,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.01605079214803655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.025583062489984827,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.025583062489984827
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827066,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827066
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17279411764705882,
"acc_stderr": 0.022966067585581788,
"acc_norm": 0.17279411764705882,
"acc_norm_stderr": 0.022966067585581788
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2826797385620915,
"acc_stderr": 0.018217269552053435,
"acc_norm": 0.2826797385620915,
"acc_norm_stderr": 0.018217269552053435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233135,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233135
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476194,
"mc2": 0.4705381335286149,
"mc2_stderr": 0.015491012979962984
}
}
```
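As a sketch of how the per-task metrics above can be post-processed locally, the snippet below averages the `acc` metric over a few task entries; the dictionary literal is a small hand-copied subset of the results shown above, not the full results file:

```python
# Average the "acc" metric across per-task entries of a results dict.
# The dict below is a truncated subset of the results printed above.
results = {
    "harness|arc:challenge|25": {"acc": 0.16040955631399317},
    "harness|hellaswag|10": {"acc": 0.2826130252937662},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
}

mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 4))  # → 0.2443
```

The real file contains one entry per evaluated task (61 configurations), so the same pattern applies after loading the `results` configuration with `load_dataset`.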
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
OneFly7/llama2-SST2-double-end-token | 2023-08-25T18:55:26.000Z | [
"region:us"
] | OneFly7 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: label_text
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8587845
num_examples: 67349
- name: validation
num_bytes: 142004
num_examples: 872
download_size: 3308564
dataset_size: 8729849
---
# Dataset Card for "llama2-SST2-double-end-token"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
longevity-genie/biolinkbert_large_512_aging_papers_paragraphs | 2023-08-25T19:44:32.000Z | [
"license:openrail",
"region:us"
] | longevity-genie | null | null | null | 0 | 0 | ---
license: openrail
---
|
claudios/java-trace-dataset | 2023-08-25T19:42:15.000Z | [
"license:cc-by-sa-3.0",
"doi:10.57967/hf/1020",
"region:us"
] | claudios | null | null | null | 0 | 0 | ---
license: cc-by-sa-3.0
dataset_info:
features:
- name: project
dtype: string
- name: test_suite
dtype: string
- name: index_in_dump
dtype: int64
- name: class_name
dtype: string
- name: method_name
dtype: string
- name: just_class_name
dtype: string
- name: just_method_name
dtype: string
- name: anonymous_classes
dtype: string
- name: anonymous_methods
dtype: string
- name: source_code
dtype: string
- name: notes
dtype: string
- name: java_calls
dtype: string
- name: calls_with_boundaries
dtype: string
- name: java_call_count
dtype: int64
- name: max_depth
dtype: int64
- name: loc_tuple
sequence: int64
- name: id
dtype: int64
splits:
- name: train
num_bytes: 23074236652
num_examples: 438224
- name: validation
num_bytes: 549889066
num_examples: 10000
- name: with_libraries
num_bytes: 550545938.8488278
num_examples: 10000
- name: without_libraries
num_bytes: 235199288
num_examples: 10000
download_size: 63530054
dataset_size: 24409870944.848827
---
|
ftresgallo/aws-cur-query | 2023-08-25T19:27:35.000Z | [
"region:us"
] | ftresgallo | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16 | 2023-08-27T12:42:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/CodeLlama-13B-Python-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/CodeLlama-13B-Python-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T19:26:38.056569](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16/blob/main/results_2023-08-25T19%3A26%3A38.056569.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26171872838516447,\n\
\ \"acc_stderr\": 0.03167776466143373,\n \"acc_norm\": 0.2638063449619791,\n\
\ \"acc_norm_stderr\": 0.03168786938490037,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.43989219144943836,\n\
\ \"mc2_stderr\": 0.014690020723528612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2960750853242321,\n \"acc_stderr\": 0.013340916085246252,\n\
\ \"acc_norm\": 0.3319112627986348,\n \"acc_norm_stderr\": 0.013760988200880543\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.35769766978689504,\n\
\ \"acc_stderr\": 0.0047834288742735764,\n \"acc_norm\": 0.44503087034455285,\n\
\ \"acc_norm_stderr\": 0.0049595354431706175\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.03279000406310052,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.03279000406310052\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.038009680605548574,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.038009680605548574\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.037082846624165444,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.037082846624165444\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.028957342788342347,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.028957342788342347\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.31290322580645163,\n \"acc_stderr\": 0.026377567028645858,\n \"\
acc_norm\": 0.31290322580645163,\n \"acc_norm_stderr\": 0.026377567028645858\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n \"acc_norm\"\
: 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2828282828282828,\n \"acc_stderr\": 0.0320877955878675,\n \"acc_norm\"\
: 0.2828282828282828,\n \"acc_norm_stderr\": 0.0320877955878675\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414359,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414359\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.02380763319865726,\n \
\ \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.02380763319865726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341937,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341937\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28990825688073396,\n \"acc_stderr\": 0.0194530666092016,\n \"\
acc_norm\": 0.28990825688073396,\n \"acc_norm_stderr\": 0.0194530666092016\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462202,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462202\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742177,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742177\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1752136752136752,\n\
\ \"acc_stderr\": 0.024904439098918214,\n \"acc_norm\": 0.1752136752136752,\n\
\ \"acc_norm_stderr\": 0.024904439098918214\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.31800766283524906,\n\
\ \"acc_stderr\": 0.016653486275615394,\n \"acc_norm\": 0.31800766283524906,\n\
\ \"acc_norm_stderr\": 0.016653486275615394\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.021855255263421795,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.021855255263421795\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095261,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095261\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.025035845227711233,\n\
\ \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.025035845227711233\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.028996909693328923,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.028996909693328923\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.43989219144943836,\n\
\ \"mc2_stderr\": 0.014690020723528612\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:26:38.056569.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:26:38.056569.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:26:38.056569.parquet'
- config_name: results
data_files:
- split: 2023_08_25T19_26_38.056569
path:
- results_2023-08-25T19:26:38.056569.parquet
- split: latest
path:
- results_2023-08-25T19:26:38.056569.parquet
---
# Dataset Card for Evaluation run of TheBloke/CodeLlama-13B-Python-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/CodeLlama-13B-Python-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Python-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
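Each timestamped split name (e.g. `2023_08_25T19_26_38.056569`) encodes the run time; as a sketch (assuming the convention, visible above, that `-` and `:` in the ISO timestamp are replaced by `_` in split names), it can be mapped back to a `datetime`:

```python
from datetime import datetime

# Split names such as "2023_08_25T19_26_38.056569" encode the run timestamp
# with "-" and ":" replaced by "_"; parse one back into a datetime.
def split_to_datetime(split_name: str) -> datetime:
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = split_to_datetime("2023_08_25T19_26_38.056569")
print(run_time.isoformat())  # 2023-08-25T19:26:38.056569
```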
## Latest results
These are the [latest results from run 2023-08-25T19:26:38.056569](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Python-fp16/blob/main/results_2023-08-25T19%3A26%3A38.056569.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26171872838516447,
"acc_stderr": 0.03167776466143373,
"acc_norm": 0.2638063449619791,
"acc_norm_stderr": 0.03168786938490037,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.43989219144943836,
"mc2_stderr": 0.014690020723528612
},
"harness|arc:challenge|25": {
"acc": 0.2960750853242321,
"acc_stderr": 0.013340916085246252,
"acc_norm": 0.3319112627986348,
"acc_norm_stderr": 0.013760988200880543
},
"harness|hellaswag|10": {
"acc": 0.35769766978689504,
"acc_stderr": 0.0047834288742735764,
"acc_norm": 0.44503087034455285,
"acc_norm_stderr": 0.0049595354431706175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.03279000406310052,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.03279000406310052
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.038009680605548574,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.038009680605548574
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.037082846624165444,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.037082846624165444
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.028957342788342347,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.028957342788342347
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.026377567028645858,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.026377567028645858
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.0320877955878675,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.0320877955878675
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414359,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414359
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.02380763319865726,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.02380763319865726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341937,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341937
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28990825688073396,
"acc_stderr": 0.0194530666092016,
"acc_norm": 0.28990825688073396,
"acc_norm_stderr": 0.0194530666092016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462202,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462202
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742177,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742177
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1752136752136752,
"acc_stderr": 0.024904439098918214,
"acc_norm": 0.1752136752136752,
"acc_norm_stderr": 0.024904439098918214
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.31800766283524906,
"acc_stderr": 0.016653486275615394,
"acc_norm": 0.31800766283524906,
"acc_norm_stderr": 0.016653486275615394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095261,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095261
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.025035845227711233,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.025035845227711233
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328923,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328923
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.43989219144943836,
"mc2_stderr": 0.014690020723528612
}
}
```
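The per-task entries in this dict can be iterated programmatically, for example to pull out each task's `acc` (a minimal sketch over a hypothetical two-task subset of the results above):

```python
# A small subset of the results dict above, for illustration.
results = {
    "harness|arc:challenge|25": {"acc": 0.2960750853242321, "acc_norm": 0.3319112627986348},
    "harness|hellaswag|10": {"acc": 0.35769766978689504, "acc_norm": 0.44503087034455285},
}

# Collect accuracy per task, skipping entries (like truthfulqa:mc) that
# report other metrics such as mc1/mc2 instead of acc.
accs = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
best_task = max(accs, key=accs.get)
print(best_task)  # harness|hellaswag|10
```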
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
RazinAleks/SO-Python_QA-DS_ML_summ_class | 2023-08-25T19:28:46.000Z | [
"region:us"
] | RazinAleks | null | null | null | 0 | 0 | Entry not found |
Szadess/Gelussw | 2023-08-25T19:29:43.000Z | [
"region:us"
] | Szadess | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es | 2023-09-17T17:59:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of clibrain/Llama-2-ft-instruct-es
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [clibrain/Llama-2-ft-instruct-es](https://huggingface.co/clibrain/Llama-2-ft-instruct-es)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T17:59:02.863865](https://huggingface.co/datasets/open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es/blob/main/results_2023-09-17T17-59-02.863865.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each task in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\
acc\": 0.2478295185477506,\n \"acc_stderr\": 0.007025978032038456\n },\n\
\ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\
\ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n\
\ }\n}\n```"
repo_url: https://huggingface.co/clibrain/Llama-2-ft-instruct-es
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T17_59_02.863865
path:
- '**/details_harness|drop|3_2023-09-17T17-59-02.863865.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T17-59-02.863865.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T17_59_02.863865
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-59-02.863865.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T17-59-02.863865.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:36:08.180753.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T19:36:08.180753.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T17_59_02.863865
path:
- '**/details_harness|winogrande|5_2023-09-17T17-59-02.863865.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T17-59-02.863865.parquet'
- config_name: results
data_files:
- split: 2023_08_25T19_36_08.180753
path:
- results_2023-08-25T19:36:08.180753.parquet
- split: 2023_09_17T17_59_02.863865
path:
- results_2023-09-17T17-59-02.863865.parquet
- split: latest
path:
- results_2023-09-17T17-59-02.863865.parquet
---
# Dataset Card for Evaluation run of clibrain/Llama-2-ft-instruct-es
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/clibrain/Llama-2-ft-instruct-es
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [clibrain/Llama-2-ft-instruct-es](https://huggingface.co/clibrain/Llama-2-ft-instruct-es) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T17:59:02.863865](https://huggingface.co/datasets/open-llm-leaderboard/details_clibrain__Llama-2-ft-instruct-es/blob/main/results_2023-09-17T17-59-02.863865.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0,
"acc": 0.2478295185477506,
"acc_stderr": 0.007025978032038456
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nc33/qapair | 2023-08-25T19:42:06.000Z | [
"region:us"
] | nc33 | null | null | null | 0 | 0 | ---
dataset_info:
config_name: qna
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: full_answer
dtype: string
- name: FaQ
dtype: string
splits:
- name: train
num_bytes: 2500462492
num_examples: 449423
download_size: 450150169
dataset_size: 2500462492
configs:
- config_name: qna
data_files:
- split: train
path: qna/train-*
---
# Dataset Card for "qapair"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OneFly7/llama2-SST2-no-template | 2023-08-25T19:42:24.000Z | [
"region:us"
] | OneFly7 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: label_text
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 6769422
num_examples: 67349
- name: validation
num_bytes: 126308
num_examples: 872
download_size: 3215289
dataset_size: 6895730
---
# Dataset Card for "llama2-SST2-no-template"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_bank-marketing_gosdt_l512_d3_sd1 | 2023-08-25T19:48:23.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 811185053
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_bank-marketing_gosdt_l512_d3_sd2 | 2023-08-25T20:11:52.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 809008458
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dfluechter/datas | 2023-08-25T20:16:53.000Z | [
"region:us"
] | dfluechter | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned | 2023-08-27T12:42:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenAssistant/galactica-6.7b-finetuned
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenAssistant/galactica-6.7b-finetuned](https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T20:22:41.470589](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned/blob/main/results_2023-08-25T20%3A22%3A41.470589.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3806259182452288,\n\
\ \"acc_stderr\": 0.03509900480322365,\n \"acc_norm\": 0.3831246357563999,\n\
\ \"acc_norm_stderr\": 0.035104541072187884,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4164599352040314,\n\
\ \"mc2_stderr\": 0.014438880882779618\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.378839590443686,\n \"acc_stderr\": 0.01417591549000032,\n\
\ \"acc_norm\": 0.41552901023890787,\n \"acc_norm_stderr\": 0.01440136664121639\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.39932284405496915,\n\
\ \"acc_stderr\": 0.004887583074180844,\n \"acc_norm\": 0.5100577574188409,\n\
\ \"acc_norm_stderr\": 0.004988771791854516\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777472,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777472\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.03056159042673184,\n\
\ \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.03056159042673184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.02306818884826112,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02306818884826112\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4806451612903226,\n\
\ \"acc_stderr\": 0.028422687404312107,\n \"acc_norm\": 0.4806451612903226,\n\
\ \"acc_norm_stderr\": 0.028422687404312107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.0328264938530415,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.0328264938530415\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.42424242424242425,\n \"acc_stderr\": 0.038592681420702615,\n\
\ \"acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.038592681420702615\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442205,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442205\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37435897435897436,\n \"acc_stderr\": 0.024537591572830513,\n\
\ \"acc_norm\": 0.37435897435897436,\n \"acc_norm_stderr\": 0.024537591572830513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.47706422018348627,\n\
\ \"acc_stderr\": 0.021414757058175506,\n \"acc_norm\": 0.47706422018348627,\n\
\ \"acc_norm_stderr\": 0.021414757058175506\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n\
\ \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3137254901960784,\n \"acc_stderr\": 0.032566854844603886,\n \"\
acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4345991561181435,\n \"acc_stderr\": 0.03226759995510145,\n \
\ \"acc_norm\": 0.4345991561181435,\n \"acc_norm_stderr\": 0.03226759995510145\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n\
\ \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.40358744394618834,\n\
\ \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.42748091603053434,\n \"acc_stderr\": 0.043389203057924,\n\
\ \"acc_norm\": 0.42748091603053434,\n \"acc_norm_stderr\": 0.043389203057924\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4793388429752066,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.4793388429752066,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04750077341199984,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04750077341199984\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4368932038834951,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.4368932038834951,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32905982905982906,\n\
\ \"acc_stderr\": 0.030782321577688166,\n \"acc_norm\": 0.32905982905982906,\n\
\ \"acc_norm_stderr\": 0.030782321577688166\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4125159642401022,\n\
\ \"acc_stderr\": 0.01760414910867192,\n \"acc_norm\": 0.4125159642401022,\n\
\ \"acc_norm_stderr\": 0.01760414910867192\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497743,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497743\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.028384256704883034,\n\
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.028384256704883034\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40514469453376206,\n\
\ \"acc_stderr\": 0.02788238379132595,\n \"acc_norm\": 0.40514469453376206,\n\
\ \"acc_norm_stderr\": 0.02788238379132595\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.404320987654321,\n \"acc_stderr\": 0.027306625297327684,\n\
\ \"acc_norm\": 0.404320987654321,\n \"acc_norm_stderr\": 0.027306625297327684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3155149934810952,\n\
\ \"acc_stderr\": 0.011869184843058636,\n \"acc_norm\": 0.3155149934810952,\n\
\ \"acc_norm_stderr\": 0.011869184843058636\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.34477124183006536,\n \"acc_stderr\": 0.01922832201869664,\n \
\ \"acc_norm\": 0.34477124183006536,\n \"acc_norm_stderr\": 0.01922832201869664\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\
\ \"acc_stderr\": 0.04631381319425463,\n \"acc_norm\": 0.37272727272727274,\n\
\ \"acc_norm_stderr\": 0.04631381319425463\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.34285714285714286,\n \"acc_stderr\": 0.03038726291954773,\n\
\ \"acc_norm\": 0.34285714285714286,\n \"acc_norm_stderr\": 0.03038726291954773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673282,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4164599352040314,\n\
\ \"mc2_stderr\": 0.014438880882779618\n }\n}\n```"
repo_url: https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|arc:challenge|25_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hellaswag|10_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T20:22:41.470589.parquet'
- config_name: results
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- results_2023-08-25T20:22:41.470589.parquet
- split: latest
path:
- results_2023-08-25T20:22:41.470589.parquet
---
# Dataset Card for Evaluation run of OpenAssistant/galactica-6.7b-finetuned
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/galactica-6.7b-finetuned](https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned",
"harness_truthfulqa_mc_0",
split="train")
```
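Because the timestamped split names follow an ISO-like date layout, they sort chronologically as plain strings, so the newest run can also be picked out of a split list without relying on the `latest` alias. A small sketch (the split name below mirrors the naming convention used in this card):

```python
def newest_split(split_names):
    """Return the most recent timestamped split, ignoring the 'latest' alias."""
    timestamped = [s for s in split_names if s != "latest"]
    # ISO-like timestamps (YYYY_MM_DDTHH_MM_SS...) sort chronologically.
    return max(timestamped)

print(newest_split(["2023_08_25T20_22_41.470589", "latest"]))
# → 2023_08_25T20_22_41.470589
```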
## Latest results
These are the [latest results from run 2023-08-25T20:22:41.470589](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned/blob/main/results_2023-08-25T20%3A22%3A41.470589.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3806259182452288,
"acc_stderr": 0.03509900480322365,
"acc_norm": 0.3831246357563999,
"acc_norm_stderr": 0.035104541072187884,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4164599352040314,
"mc2_stderr": 0.014438880882779618
},
"harness|arc:challenge|25": {
"acc": 0.378839590443686,
"acc_stderr": 0.01417591549000032,
"acc_norm": 0.41552901023890787,
"acc_norm_stderr": 0.01440136664121639
},
"harness|hellaswag|10": {
"acc": 0.39932284405496915,
"acc_stderr": 0.004887583074180844,
"acc_norm": 0.5100577574188409,
"acc_norm_stderr": 0.004988771791854516
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777472,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777472
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.03056159042673184,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.03056159042673184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02306818884826112,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02306818884826112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4806451612903226,
"acc_stderr": 0.028422687404312107,
"acc_norm": 0.4806451612903226,
"acc_norm_stderr": 0.028422687404312107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.0328264938530415,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.0328264938530415
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.038592681420702615,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.038592681420702615
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442205,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442205
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37435897435897436,
"acc_stderr": 0.024537591572830513,
"acc_norm": 0.37435897435897436,
"acc_norm_stderr": 0.024537591572830513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.47706422018348627,
"acc_stderr": 0.021414757058175506,
"acc_norm": 0.47706422018348627,
"acc_norm_stderr": 0.021414757058175506
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4345991561181435,
"acc_stderr": 0.03226759995510145,
"acc_norm": 0.4345991561181435,
"acc_norm_stderr": 0.03226759995510145
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.42748091603053434,
"acc_stderr": 0.043389203057924,
"acc_norm": 0.42748091603053434,
"acc_norm_stderr": 0.043389203057924
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4793388429752066,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.4793388429752066,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199984,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199984
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.4368932038834951,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.4368932038834951,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32905982905982906,
"acc_stderr": 0.030782321577688166,
"acc_norm": 0.32905982905982906,
"acc_norm_stderr": 0.030782321577688166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4125159642401022,
"acc_stderr": 0.01760414910867192,
"acc_norm": 0.4125159642401022,
"acc_norm_stderr": 0.01760414910867192
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497743,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497743
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.028384256704883034,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.028384256704883034
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40514469453376206,
"acc_stderr": 0.02788238379132595,
"acc_norm": 0.40514469453376206,
"acc_norm_stderr": 0.02788238379132595
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.404320987654321,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.404320987654321,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3155149934810952,
"acc_stderr": 0.011869184843058636,
"acc_norm": 0.3155149934810952,
"acc_norm_stderr": 0.011869184843058636
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.34477124183006536,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.34477124183006536,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425463,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425463
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.34285714285714286,
"acc_stderr": 0.03038726291954773,
"acc_norm": 0.34285714285714286,
"acc_norm_stderr": 0.03038726291954773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673282,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673282
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4164599352040314,
"mc2_stderr": 0.014438880882779618
}
}
```
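As an illustration of how a results dictionary like the one above is typically consumed, here is a small sketch that averages the per-subject `hendrycksTest` (MMLU) accuracies. The helper is hypothetical and not part of the leaderboard tooling, and the sample values are abbreviated from the table above:

```python
def mmlu_average(results: dict) -> float:
    """Average the 'acc' of every hendrycksTest subject in a results dict."""
    accs = [
        v["acc"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Tiny example with two subjects taken from the table above.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.41},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889},
    "harness|truthfulqa:mc|0": {"mc1": 0.257},  # ignored: not an MMLU task
}
print(round(mmlu_average(sample), 4))
# → 0.4494
```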
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_bank-marketing_gosdt_l512_d3_sd3 | 2023-08-25T20:24:30.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 811018150
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
exposetobacco/tweets_sentiment | 2023-08-25T20:37:20.000Z | [
"region:us"
] | exposetobacco | null | null | null | 0 | 0 | This is a tweets dataset with three sentiment labels: 0 = negative, 1 = neutral, 2 = positive.
It is suitable for fine-tuning LLMs for tweet sentiment analysis.
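A minimal sketch of mapping the integer labels above to their names when preparing such data for fine-tuning (the mapping follows the description in this card; column names are omitted since the card does not specify them):

```python
# Label mapping as described by the dataset card.
SENTIMENT_LABELS = {0: "negative", 1: "neutral", 2: "positive"}

def label_name(label_id: int) -> str:
    """Translate an integer sentiment label into its human-readable name."""
    return SENTIMENT_LABELS[label_id]

print(label_name(0), label_name(2))
# → negative positive
```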
|
7777yeet/VNS | 2023-08-25T20:43:13.000Z | [
"region:us"
] | 7777yeet | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_porkorbeef__Llama-2-13b | 2023-09-24T15:59:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of porkorbeef/Llama-2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [porkorbeef/Llama-2-13b](https://huggingface.co/porkorbeef/Llama-2-13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_porkorbeef__Llama-2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-24T15:59:06.567352](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b/blob/main/results_2023-09-24T15-59-06.567352.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 4.404362416107384e-05,\n \"f1_stderr\"\
: 1.350418751210094e-05,\n \"acc\": 0.2584846093133386,\n \"acc_stderr\"\
: 0.007022195200806489\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n\
\ \"em_stderr\": 0.0,\n \"f1\": 4.404362416107384e-05,\n \"\
f1_stderr\": 1.350418751210094e-05\n },\n \"harness|gsm8k|5\": {\n \
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.014044390401612978\n\
\ }\n}\n```"
repo_url: https://huggingface.co/porkorbeef/Llama-2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|arc:challenge|25_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_24T15_59_06.567352
path:
- '**/details_harness|drop|3_2023-09-24T15-59-06.567352.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-24T15-59-06.567352.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_24T15_59_06.567352
path:
- '**/details_harness|gsm8k|5_2023-09-24T15-59-06.567352.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-24T15-59-06.567352.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hellaswag|10_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:46:35.399741.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T20:46:35.399741.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T20:46:35.399741.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_24T15_59_06.567352
path:
- '**/details_harness|winogrande|5_2023-09-24T15-59-06.567352.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-24T15-59-06.567352.parquet'
- config_name: results
data_files:
- split: 2023_08_25T20_46_35.399741
path:
- results_2023-08-25T20:46:35.399741.parquet
- split: 2023_09_24T15_59_06.567352
path:
- results_2023-09-24T15-59-06.567352.parquet
- split: latest
path:
- results_2023-09-24T15-59-06.567352.parquet
---
# Dataset Card for Evaluation run of porkorbeef/Llama-2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/porkorbeef/Llama-2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [porkorbeef/Llama-2-13b](https://huggingface.co/porkorbeef/Llama-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_porkorbeef__Llama-2-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-24T15:59:06.567352](https://huggingface.co/datasets/open-llm-leaderboard/details_porkorbeef__Llama-2-13b/blob/main/results_2023-09-24T15-59-06.567352.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 4.404362416107384e-05,
"f1_stderr": 1.350418751210094e-05,
"acc": 0.2584846093133386,
"acc_stderr": 0.007022195200806489
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 4.404362416107384e-05,
"f1_stderr": 1.350418751210094e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612978
}
}
```
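Once a `results_*.json` file like the one above has been downloaded, its metrics can be read with the standard `json` module. A minimal sketch, inlining the values shown above rather than fetching the file:

```python
import json

# A sketch: the latest aggregated results shown above, parsed from their JSON form.
results_json = """
{
  "all": {"em": 0.0, "em_stderr": 0.0,
          "f1": 4.404362416107384e-05, "f1_stderr": 1.350418751210094e-05,
          "acc": 0.2584846093133386, "acc_stderr": 0.007022195200806489},
  "harness|drop|3": {"em": 0.0, "em_stderr": 0.0,
                     "f1": 4.404362416107384e-05, "f1_stderr": 1.350418751210094e-05},
  "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
  "harness|winogrande|5": {"acc": 0.5169692186266772, "acc_stderr": 0.014044390401612978}
}
"""
results = json.loads(results_json)

# Per-task metrics are keyed as "harness|<task>|<num_fewshot>".
winogrande_acc = results["harness|winogrande|5"]["acc"]
```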
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11.1-bf16 | 2023-09-22T09:18:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-llama2-13b-v11.1-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-llama2-13b-v11.1-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11.1-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11.1-bf16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-22T09:17:00.712298](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11.1-bf16/blob/main/results_2023-09-22T09-17-00.712298.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5631473079976127,\n\
\ \"acc_stderr\": 0.03458635153276765,\n \"acc_norm\": 0.56700982666984,\n\
\ \"acc_norm_stderr\": 0.034574474410597564,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.4970271534720991,\n\
\ \"mc2_stderr\": 0.015285520595244436\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4854948805460751,\n \"acc_stderr\": 0.014605241081370053,\n\
\ \"acc_norm\": 0.5162116040955631,\n \"acc_norm_stderr\": 0.01460370856741495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5651264688309102,\n\
\ \"acc_stderr\": 0.00494727245422621,\n \"acc_norm\": 0.76229834694284,\n\
\ \"acc_norm_stderr\": 0.004248054760146077\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n\
\ \"acc_stderr\": 0.026593084516572264,\n \"acc_norm\": 0.6774193548387096,\n\
\ \"acc_norm_stderr\": 0.026593084516572264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524582,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524582\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842544,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842544\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572922,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572922\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.024161618127987745,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.024161618127987745\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494578,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.026033890613576277,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.026033890613576277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.02764814959975147,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.02764814959975147\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n\
\ \"acc_stderr\": 0.012593959992906422,\n \"acc_norm\": 0.4172099087353325,\n\
\ \"acc_norm_stderr\": 0.012593959992906422\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492527,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492527\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154188,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154188\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.4970271534720991,\n\
\ \"mc2_stderr\": 0.015285520595244436\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11.1-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|arc:challenge|25_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|arc:challenge|25_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hellaswag|10_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hellaswag|10_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:49:31.940231.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-17-00.712298.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-22T09-17-00.712298.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T20:49:31.940231.parquet'
- split: 2023_09_22T09_17_00.712298
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T09-17-00.712298.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-22T09-17-00.712298.parquet'
- config_name: results
data_files:
- split: 2023_08_25T20_49_31.940231
path:
- results_2023-08-25T20:49:31.940231.parquet
- split: 2023_09_22T09_17_00.712298
path:
- results_2023-09-22T09-17-00.712298.parquet
- split: latest
path:
- results_2023-09-22T09-17-00.712298.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-13b-v11.1-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11.1-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-llama2-13b-v11.1-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v11.1-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11.1-bf16",
"harness_truthfulqa_mc_0",
	split="latest")
```
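As a side note, the timestamped split names in the configurations above are a mechanical transformation of the run timestamp; the helper below is an illustrative sketch (not part of the official tooling) of how one maps to the other:

```python
def timestamp_to_split(ts: str) -> str:
    """Map an ISO run timestamp to the split name used in the configs above.

    Split names appear to replace the "-" and ":" separators with "_",
    e.g. "2023-09-22T09:17:00.712298" -> "2023_09_22T09_17_00.712298".
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-22T09:17:00.712298"))
# -> 2023_09_22T09_17_00.712298
```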
## Latest results
These are the [latest results from run 2023-09-22T09:17:00.712298](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v11.1-bf16/blob/main/results_2023-09-22T09-17-00.712298.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5631473079976127,
"acc_stderr": 0.03458635153276765,
"acc_norm": 0.56700982666984,
"acc_norm_stderr": 0.034574474410597564,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.4970271534720991,
"mc2_stderr": 0.015285520595244436
},
"harness|arc:challenge|25": {
"acc": 0.4854948805460751,
"acc_stderr": 0.014605241081370053,
"acc_norm": 0.5162116040955631,
"acc_norm_stderr": 0.01460370856741495
},
"harness|hellaswag|10": {
"acc": 0.5651264688309102,
"acc_stderr": 0.00494727245422621,
"acc_norm": 0.76229834694284,
"acc_norm_stderr": 0.004248054760146077
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572264,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524582,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524582
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842544,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842544
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572922,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572922
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.024161618127987745,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.024161618127987745
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494578,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.026033890613576277,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.026033890613576277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.02764814959975147,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.02764814959975147
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419994,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906422,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906422
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492527,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492527
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154188,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154188
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.4970271534720991,
"mc2_stderr": 0.015285520595244436
}
}
```
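The per-task entries above are plain JSON, so derived metrics can be recomputed directly; for example, the sketch below (`mmlu_macro_average` is a hypothetical helper, not leaderboard code) computes the unweighted MMLU average over the `hendrycksTest` entries of a results dict shaped like the one above:

```python
def mmlu_macro_average(results: dict) -> float:
    """Unweighted mean of "acc" over the hendrycksTest (MMLU) entries
    of a results dict shaped like the JSON above."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)
```

Note that this differs from the "all" block above, which aggregates over every task (including the ARC and HellaSwag entries).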
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_IGeniusDev__llama13B-quant8-testv1-openorca-customdataset | 2023-09-22T22:58:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of IGeniusDev/llama13B-quant8-testv1-openorca-customdataset
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [IGeniusDev/llama13B-quant8-testv1-openorca-customdataset](https://huggingface.co/IGeniusDev/llama13B-quant8-testv1-openorca-customdataset)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_IGeniusDev__llama13B-quant8-testv1-openorca-customdataset\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T22:58:12.026440](https://huggingface.co/datasets/open-llm-leaderboard/details_IGeniusDev__llama13B-quant8-testv1-openorca-customdataset/blob/main/results_2023-09-22T22-58-12.026440.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196460785,\n \"f1\": 0.061706166107382464,\n\
\ \"f1_stderr\": 0.0013539743732266353,\n \"acc\": 0.4288700212365805,\n\
\ \"acc_stderr\": 0.010174848411278824\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460785,\n\
\ \"f1\": 0.061706166107382464,\n \"f1_stderr\": 0.0013539743732266353\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10083396512509477,\n \
\ \"acc_stderr\": 0.008294031192126607\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431043\n\
\ }\n}\n```"
repo_url: https://huggingface.co/IGeniusDev/llama13B-quant8-testv1-openorca-customdataset
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|arc:challenge|25_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|arc:challenge|25_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T22_58_12.026440
path:
- '**/details_harness|drop|3_2023-09-22T22-58-12.026440.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T22-58-12.026440.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T22_58_12.026440
path:
- '**/details_harness|gsm8k|5_2023-09-22T22-58-12.026440.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T22-58-12.026440.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hellaswag|10_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hellaswag|10_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T21:11:04.087350.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T19:50:33.159802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T21:11:04.087350.parquet'
- split: 2023_08_28T19_50_33.159802
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T19:50:33.159802.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T19:50:33.159802.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T22_58_12.026440
path:
- '**/details_harness|winogrande|5_2023-09-22T22-58-12.026440.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T22-58-12.026440.parquet'
- config_name: results
data_files:
- split: 2023_08_25T21_11_04.087350
path:
- results_2023-08-25T21:11:04.087350.parquet
- split: 2023_08_28T19_50_33.159802
path:
- results_2023-08-28T19:50:33.159802.parquet
- split: 2023_09_22T22_58_12.026440
path:
- results_2023-09-22T22-58-12.026440.parquet
- split: latest
path:
- results_2023-09-22T22-58-12.026440.parquet
---
# Dataset Card for Evaluation run of IGeniusDev/llama13B-quant8-testv1-openorca-customdataset
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/IGeniusDev/llama13B-quant8-testv1-openorca-customdataset
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [IGeniusDev/llama13B-quant8-testv1-openorca-customdataset](https://huggingface.co/IGeniusDev/llama13B-quant8-testv1-openorca-customdataset) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_IGeniusDev__llama13B-quant8-testv1-openorca-customdataset",
"harness_winogrande_5",
split="train")
```
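Note that the split names are derived from the run timestamps by replacing the `-` and `:` characters with `_`. A small helper (purely illustrative, not part of the `datasets` API) makes the mapping explicit:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (as it appears in the result filenames)
    to the corresponding split name used in this dataset."""
    return ts.replace("-", "_").replace(":", "_")

# The latest run above:
print(timestamp_to_split("2023-09-22T22:58:12.026440"))
# -> 2023_09_22T22_58_12.026440
```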
## Latest results
These are the [latest results from run 2023-09-22T22:58:12.026440](https://huggingface.co/datasets/open-llm-leaderboard/details_IGeniusDev__llama13B-quant8-testv1-openorca-customdataset/blob/main/results_2023-09-22T22-58-12.026440.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460785,
"f1": 0.061706166107382464,
"f1_stderr": 0.0013539743732266353,
"acc": 0.4288700212365805,
"acc_stderr": 0.010174848411278824
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460785,
"f1": 0.061706166107382464,
"f1_stderr": 0.0013539743732266353
},
"harness|gsm8k|5": {
"acc": 0.10083396512509477,
"acc_stderr": 0.008294031192126607
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431043
}
}
```
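As a quick sanity check, the aggregated `"all"` accuracy above appears to be the unweighted mean of the per-task accuracies (a sketch with values copied from the JSON; the exact aggregation used by the leaderboard may differ):

```python
# Per-task accuracies copied from the results JSON above.
task_accs = {
    "harness|gsm8k|5": 0.10083396512509477,
    "harness|winogrande|5": 0.7569060773480663,
}

# The unweighted mean over tasks reproduces the "all" accuracy.
mean_acc = sum(task_accs.values()) / len(task_accs)
print(f"{mean_acc:.10f}")  # -> 0.4288700212
```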
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
JeremyMoore/oida | 2023-08-25T21:34:55.000Z | [
"region:us"
] | JeremyMoore | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TaylorAI__Flash-Llama-7B | 2023-08-27T12:42:51.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TaylorAI/Flash-Llama-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TaylorAI/Flash-Llama-7B](https://huggingface.co/TaylorAI/Flash-Llama-7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TaylorAI__Flash-Llama-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T21:56:25.848117](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__Flash-Llama-7B/blob/main/results_2023-08-25T21%3A56%3A25.848117.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47043597201725107,\n\
\ \"acc_stderr\": 0.03529263908245757,\n \"acc_norm\": 0.47444479585837357,\n\
\ \"acc_norm_stderr\": 0.03527837427331349,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5884285998805019,\n\
\ \"acc_stderr\": 0.0049111251010646425,\n \"acc_norm\": 0.785700059749054,\n\
\ \"acc_norm_stderr\": 0.004094971980892084\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729555,\n \"\
acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729555\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015476,\n \"\
acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015476\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n\
\ \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36114732724902215,\n\
\ \"acc_stderr\": 0.01226793547751903,\n \"acc_norm\": 0.36114732724902215,\n\
\ \"acc_norm_stderr\": 0.01226793547751903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n\
\ \"mc2_stderr\": 0.013510147651392562\n }\n}\n```"
repo_url: https://huggingface.co/TaylorAI/Flash-Llama-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|arc:challenge|25_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hellaswag|10_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T21:56:25.848117.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T21:56:25.848117.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T21:56:25.848117.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T21:56:25.848117.parquet'
- config_name: results
data_files:
- split: 2023_08_25T21_56_25.848117
path:
- results_2023-08-25T21:56:25.848117.parquet
- split: latest
path:
- results_2023-08-25T21:56:25.848117.parquet
---
# Dataset Card for Evaluation run of TaylorAI/Flash-Llama-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TaylorAI/Flash-Llama-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TaylorAI/Flash-Llama-7B](https://huggingface.co/TaylorAI/Flash-Llama-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TaylorAI__Flash-Llama-7B",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-25T21:56:25.848117](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__Flash-Llama-7B/blob/main/results_2023-08-25T21%3A56%3A25.848117.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47043597201725107,
"acc_stderr": 0.03529263908245757,
"acc_norm": 0.47444479585837357,
"acc_norm_stderr": 0.03527837427331349,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5884285998805019,
"acc_stderr": 0.0049111251010646425,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.004094971980892084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6311926605504588,
"acc_stderr": 0.020686227560729555,
"acc_norm": 0.6311926605504588,
"acc_norm_stderr": 0.020686227560729555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015476,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015476
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36114732724902215,
"acc_stderr": 0.01226793547751903,
"acc_norm": 0.36114732724902215,
"acc_norm_stderr": 0.01226793547751903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
}
}
```
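As a quick sanity check on the aggregated numbers, the per-task scores above can be averaged by hand. The sketch below embeds a small excerpt of the JSON (three of the `hendrycksTest` entries, with variable names chosen here for illustration) and computes the mean `acc` across them; running it over the full results file works the same way, just load the complete JSON instead:

```python
# Small excerpt of the results JSON shown above (3 of the hendrycksTest tasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.48148148148148145},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.40789473684210525},
}

# Keep only the MMLU subtasks and average their accuracy.
mmlu_scores = [v["acc"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_scores) / len(mmlu_scores)
print(f"mean acc over {len(mmlu_scores)} tasks: {mean_acc:.4f}")
```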
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
huggingface-projects/utils | 2023-08-25T21:57:21.000Z | [
"region:us"
] | huggingface-projects | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_codellama__CodeLlama-7b-Python-hf | 2023-08-27T12:42:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-7b-Python-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-7b-Python-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T02:47:34.882651](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-Python-hf/blob/main/results_2023-08-26T02%3A47%3A34.882651.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2755308754114643,\n\
\ \"acc_stderr\": 0.03222805438246509,\n \"acc_norm\": 0.2781902042922924,\n\
\ \"acc_norm_stderr\": 0.03223662912853709,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4221405339092182,\n\
\ \"mc2_stderr\": 0.014520275276983402\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28071672354948807,\n \"acc_stderr\": 0.013131238126975578,\n\
\ \"acc_norm\": 0.31313993174061433,\n \"acc_norm_stderr\": 0.013552671543623503\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4041027683728341,\n\
\ \"acc_stderr\": 0.004897146690596259,\n \"acc_norm\": 0.5285799641505676,\n\
\ \"acc_norm_stderr\": 0.004981623292196192\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.030976692998534436,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.030976692998534436\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.31290322580645163,\n \"acc_stderr\": 0.02637756702864586,\n \"\
acc_norm\": 0.31290322580645163,\n \"acc_norm_stderr\": 0.02637756702864586\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n \"\
acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30303030303030304,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.022421273612923707,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.022421273612923707\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097835,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188703,\n \
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188703\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26055045871559634,\n \"acc_stderr\": 0.018819182034850068,\n \"\
acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.018819182034850068\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906942,\n \
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906942\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.03826076324884864,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.03826076324884864\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.027421007295392933,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.027421007295392933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.31417624521072796,\n\
\ \"acc_stderr\": 0.016599291735884904,\n \"acc_norm\": 0.31417624521072796,\n\
\ \"acc_norm_stderr\": 0.016599291735884904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958167,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958167\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3086816720257235,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.3086816720257235,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.022535006705942818,\n\
\ \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.022535006705942818\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.01094657096634878,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.01094657096634878\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504634,\n \
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.031414708025865885,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.031414708025865885\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401464,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401464\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.4221405339092182,\n\
\ \"mc2_stderr\": 0.014520275276983402\n }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-7b-Python-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|arc:challenge|25_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hellaswag|10_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:02:01.262189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:47:34.882651.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:47:34.882651.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T22:02:01.262189.parquet'
- split: 2023_08_26T02_47_34.882651
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:47:34.882651.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:47:34.882651.parquet'
- config_name: results
data_files:
- split: 2023_08_25T22_02_01.262189
path:
- results_2023-08-25T22:02:01.262189.parquet
- split: 2023_08_26T02_47_34.882651
path:
- results_2023-08-26T02:47:34.882651.parquet
- split: latest
path:
- results_2023-08-26T02:47:34.882651.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-7b-Python-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-7b-Python-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-7b-Python-hf",
	"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T02:47:34.882651](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-7b-Python-hf/blob/main/results_2023-08-26T02%3A47%3A34.882651.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2755308754114643,
"acc_stderr": 0.03222805438246509,
"acc_norm": 0.2781902042922924,
"acc_norm_stderr": 0.03223662912853709,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4221405339092182,
"mc2_stderr": 0.014520275276983402
},
"harness|arc:challenge|25": {
"acc": 0.28071672354948807,
"acc_stderr": 0.013131238126975578,
"acc_norm": 0.31313993174061433,
"acc_norm_stderr": 0.013552671543623503
},
"harness|hellaswag|10": {
"acc": 0.4041027683728341,
"acc_stderr": 0.004897146690596259,
"acc_norm": 0.5285799641505676,
"acc_norm_stderr": 0.004981623292196192
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.030976692998534436,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.030976692998534436
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.022421273612923707,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.022421273612923707
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097835,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.018819182034850068,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.018819182034850068
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906942,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906942
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.03826076324884864,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.03826076324884864
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392933,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.31417624521072796,
"acc_stderr": 0.016599291735884904,
"acc_norm": 0.31417624521072796,
"acc_norm_stderr": 0.016599291735884904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958167,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958167
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3086816720257235,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.3086816720257235,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20679012345679013,
"acc_stderr": 0.022535006705942818,
"acc_norm": 0.20679012345679013,
"acc_norm_stderr": 0.022535006705942818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.01094657096634878,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.01094657096634878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.01716058723504634,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.01716058723504634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401464,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401464
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.4221405339092182,
"mc2_stderr": 0.014520275276983402
}
}
```
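The per-task dictionary above is keyed by harness task name, so aggregate metrics can be recomputed directly from it. A minimal sketch, using a small hand-copied subset of the metrics shown above rather than the full results file, that averages `acc_norm` over the MMLU (`hendrycksTest`) subtasks:

```python
# Excerpt of the per-task metrics shown above, keyed by harness task name.
# (Subset for illustration only; the real file contains all 61 tasks.)
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.31313993174061433},
    "harness|hellaswag|10": {"acc_norm": 0.5285799641505676},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.2074074074074074},
}

# Average normalized accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu_scores = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average acc_norm over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```

The same pattern works on the full `results_*.json` file after loading it with `json.load`.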
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
exposetobacco/airline_reviews | 2023-08-25T22:08:32.000Z | [
"region:us"
] | exposetobacco | null | null | null | 0 | 0 | The dataset is derived from US airline reviews and has three sentiment labels: 0: negative, 1: neutral, 2: positive.
The dataset can be used for sentiment analysis and for fine-tuning LLMs.
DRAGOO/dataset_dyal_darija | 2023-08-25T22:08:17.000Z | [
"region:us"
] | DRAGOO | null | null | null | 2 | 0 | Entry not found |
badokorach/transqa | 2023-08-25T22:55:02.000Z | [
"region:us"
] | badokorach | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TaylorAI__Flash-Llama-3B | 2023-08-27T12:42:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TaylorAI/Flash-Llama-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TaylorAI/Flash-Llama-3B](https://huggingface.co/TaylorAI/Flash-Llama-3B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TaylorAI__Flash-Llama-3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T22:20:12.402392](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__Flash-Llama-3B/blob/main/results_2023-08-25T22%3A20%3A12.402392.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.274738397715905,\n\
\ \"acc_stderr\": 0.03232898823630556,\n \"acc_norm\": 0.27862036671503904,\n\
\ \"acc_norm_stderr\": 0.0323262442083981,\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427683,\n \"mc2\": 0.3473974860131068,\n\
\ \"mc2_stderr\": 0.013258021799654561\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3575085324232082,\n \"acc_stderr\": 0.014005494275916573,\n\
\ \"acc_norm\": 0.40102389078498296,\n \"acc_norm_stderr\": 0.014322255790719864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5300736904999004,\n\
\ \"acc_stderr\": 0.004980747448813317,\n \"acc_norm\": 0.7155945030870344,\n\
\ \"acc_norm_stderr\": 0.004502088287470151\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073464,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073464\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952924,\n\
\ \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.04943110704237103,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.04943110704237103\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.03214737302029469,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.03214737302029469\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.03036358219723817,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.03036358219723817\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748143,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748143\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924318,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924318\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2129032258064516,\n\
\ \"acc_stderr\": 0.02328766512726853,\n \"acc_norm\": 0.2129032258064516,\n\
\ \"acc_norm_stderr\": 0.02328766512726853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.029225575892489614,\n\
\ \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.029225575892489614\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885416,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885416\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.03074890536390988,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390988\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.022421273612923724,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.022421273612923724\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02534809746809786,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02534809746809786\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.02880139219363128,\n \
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.02880139219363128\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.018368176306598618,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.018368176306598618\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.027920963147993666,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.027920963147993666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.38565022421524664,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2892561983471074,\n \"acc_stderr\": 0.04139112727635464,\n \"\
acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.04139112727635464\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326469,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326469\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.029058588303748845,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.029058588303748845\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2771392081736909,\n\
\ \"acc_stderr\": 0.01600563629412243,\n \"acc_norm\": 0.2771392081736909,\n\
\ \"acc_norm_stderr\": 0.01600563629412243\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925308,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925308\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875202,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875202\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30851063829787234,\n \"acc_stderr\": 0.02755336616510137,\n \
\ \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.02755336616510137\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\
\ \"acc_stderr\": 0.010956556654417344,\n \"acc_norm\": 0.24315514993481094,\n\
\ \"acc_norm_stderr\": 0.010956556654417344\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.02503584522771125,\n\
\ \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.02503584522771125\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27124183006535946,\n \"acc_stderr\": 0.01798661530403031,\n \
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.01798661530403031\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427683,\n \"mc2\": 0.3473974860131068,\n\
\ \"mc2_stderr\": 0.013258021799654561\n }\n}\n```"
repo_url: https://huggingface.co/TaylorAI/Flash-Llama-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|arc:challenge|25_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hellaswag|10_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:20:12.402392.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:20:12.402392.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T22:20:12.402392.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T22:20:12.402392.parquet'
- config_name: results
data_files:
- split: 2023_08_25T22_20_12.402392
path:
- results_2023-08-25T22:20:12.402392.parquet
- split: latest
path:
- results_2023-08-25T22:20:12.402392.parquet
---
# Dataset Card for Evaluation run of TaylorAI/Flash-Llama-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TaylorAI/Flash-Llama-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TaylorAI/Flash-Llama-3B](https://huggingface.co/TaylorAI/Flash-Llama-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TaylorAI__Flash-Llama-3B",
"harness_truthfulqa_mc_0",
	split="latest")
```
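Every per-task config listed in the front matter above follows the same naming pattern, so config names can be assembled programmatically before downloading (a sketch: the `harness_config_name` helper is ours, not part of the `datasets` API, and the commented-out `load_dataset` call would need network access):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a config name matching the pattern used in this repo,
    e.g. "harness_hendrycksTest_philosophy_5" or "harness_truthfulqa_mc_0"."""
    return f"harness_{task}_{num_fewshot}"

repo = "open-llm-leaderboard/details_TaylorAI__Flash-Llama-3B"
config = harness_config_name("hendrycksTest_philosophy", 5)
print(config)  # harness_hendrycksTest_philosophy_5
# data = load_dataset(repo, config, split="latest")  # requires network access
```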
## Latest results
These are the [latest results from run 2023-08-25T22:20:12.402392](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__Flash-Llama-3B/blob/main/results_2023-08-25T22%3A20%3A12.402392.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.274738397715905,
"acc_stderr": 0.03232898823630556,
"acc_norm": 0.27862036671503904,
"acc_norm_stderr": 0.0323262442083981,
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427683,
"mc2": 0.3473974860131068,
"mc2_stderr": 0.013258021799654561
},
"harness|arc:challenge|25": {
"acc": 0.3575085324232082,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.40102389078498296,
"acc_norm_stderr": 0.014322255790719864
},
"harness|hellaswag|10": {
"acc": 0.5300736904999004,
"acc_stderr": 0.004980747448813317,
"acc_norm": 0.7155945030870344,
"acc_norm_stderr": 0.004502088287470151
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073464,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073464
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952924,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.03214737302029469,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.03214737302029469
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748143,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748143
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924318,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924318
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2129032258064516,
"acc_stderr": 0.02328766512726853,
"acc_norm": 0.2129032258064516,
"acc_norm_stderr": 0.02328766512726853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.029225575892489614,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.029225575892489614
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390988,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390988
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.022421273612923724,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.022421273612923724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02534809746809786,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02534809746809786
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.02880139219363128,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.02880139219363128
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993666,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.04139112727635464,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.04139112727635464
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748845,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2771392081736909,
"acc_stderr": 0.01600563629412243,
"acc_norm": 0.2771392081736909,
"acc_norm_stderr": 0.01600563629412243
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925308,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925308
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875202,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875202
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.02755336616510137,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.02755336616510137
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.010956556654417344,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.010956556654417344
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.02503584522771125,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.02503584522771125
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.01798661530403031,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.01798661530403031
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427683,
"mc2": 0.3473974860131068,
"mc2_stderr": 0.013258021799654561
}
}
```
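Because every per-task entry in this JSON shares the same shape, summary statistics such as a hendrycksTest (MMLU-style) macro-average accuracy can be recomputed directly from it. A minimal sketch over a three-task subset of the results above (the full run would average over all hendrycksTest entries):

```python
# Recompute a macro-average "acc" over hendrycksTest tasks from a results
# dict shaped like the JSON above (three-task subset shown for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.27631578947368424},
}

# Select only the hendrycksTest entries and average their "acc" fields.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
macro_avg = sum(mmlu) / len(mmlu)
print(round(macro_avg, 4))  # 0.2553
```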
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_codellama__CodeLlama-13b-hf | 2023-08-27T12:42:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-13b-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T22:41:00.019716](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-hf/blob/main/results_2023-08-25T22%3A41%3A00.019716.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.33133305646279027,\n\
\ \"acc_stderr\": 0.033819526019171764,\n \"acc_norm\": 0.3346318411229531,\n\
\ \"acc_norm_stderr\": 0.03381993357374892,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.43794269776602796,\n\
\ \"mc2_stderr\": 0.01446900625927817\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3779863481228669,\n \"acc_stderr\": 0.014169664520303101,\n\
\ \"acc_norm\": 0.4087030716723549,\n \"acc_norm_stderr\": 0.014365750345427006\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.469627564230233,\n\
\ \"acc_stderr\": 0.004980566907790459,\n \"acc_norm\": 0.6335391356303525,\n\
\ \"acc_norm_stderr\": 0.0048085268027185865\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770861,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770861\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3419354838709677,\n\
\ \"acc_stderr\": 0.026985289576552725,\n \"acc_norm\": 0.3419354838709677,\n\
\ \"acc_norm_stderr\": 0.026985289576552725\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47150259067357514,\n \"acc_stderr\": 0.036025735712884414,\n\
\ \"acc_norm\": 0.47150259067357514,\n \"acc_norm_stderr\": 0.036025735712884414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3192660550458716,\n \"acc_stderr\": 0.019987829069750017,\n \"\
acc_norm\": 0.3192660550458716,\n \"acc_norm_stderr\": 0.019987829069750017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3628691983122363,\n \"acc_stderr\": 0.03129920825530213,\n \
\ \"acc_norm\": 0.3628691983122363,\n \"acc_norm_stderr\": 0.03129920825530213\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.35877862595419846,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.35877862595419846,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591311,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591311\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.04825729337356391,\n\
\ \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.04825729337356391\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5213675213675214,\n\
\ \"acc_stderr\": 0.032726164476349545,\n \"acc_norm\": 0.5213675213675214,\n\
\ \"acc_norm_stderr\": 0.032726164476349545\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.438058748403576,\n\
\ \"acc_stderr\": 0.017742232238257223,\n \"acc_norm\": 0.438058748403576,\n\
\ \"acc_norm_stderr\": 0.017742232238257223\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841286,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841286\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369918,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369918\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.027184498909941613,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.027184498909941613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3890675241157556,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.3890675241157556,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.36728395061728397,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.36728395061728397,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859063,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859063\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2966101694915254,\n\
\ \"acc_stderr\": 0.011665946586082838,\n \"acc_norm\": 0.2966101694915254,\n\
\ \"acc_norm_stderr\": 0.011665946586082838\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.027971541370170605,\n\
\ \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.027971541370170605\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27941176470588236,\n \"acc_stderr\": 0.018152871051538816,\n \
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.018152871051538816\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.417910447761194,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.417910447761194,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4327485380116959,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.4327485380116959,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.43794269776602796,\n\
\ \"mc2_stderr\": 0.01446900625927817\n }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|arc:challenge|25_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hellaswag|10_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T22:41:00.019716.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T22:41:00.019716.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T22:41:00.019716.parquet'
- config_name: results
data_files:
- split: 2023_08_25T22_41_00.019716
path:
- results_2023-08-25T22:41:00.019716.parquet
- split: latest
path:
- results_2023-08-25T22:41:00.019716.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-13b-hf",
"harness_truthfulqa_mc_0",
	split="latest")
```
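Each per-task configuration name appears to be derived from the harness task name by replacing the `|` and `:` separators with underscores (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). A minimal helper, assuming this naming pattern holds for all configs listed above:

```python
def task_to_config(task: str) -> str:
    """Map a harness task name to its dataset config name.

    e.g. "harness|arc:challenge|25" -> "harness_arc_challenge_25"
    """
    # Replace both separator characters used in harness task names.
    return task.replace("|", "_").replace(":", "_")


# Example: build the config name for the TruthfulQA MC task.
config_name = task_to_config("harness|truthfulqa:mc|0")
# -> "harness_truthfulqa_mc_0"
```

This can be handy when iterating over the task list from a results file to load each task's details programmatically.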
## Latest results
These are the [latest results from run 2023-08-25T22:41:00.019716](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-13b-hf/blob/main/results_2023-08-25T22%3A41%3A00.019716.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits and in the "latest" split of its configuration):
```json
{
"all": {
"acc": 0.33133305646279027,
"acc_stderr": 0.033819526019171764,
"acc_norm": 0.3346318411229531,
"acc_norm_stderr": 0.03381993357374892,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.43794269776602796,
"mc2_stderr": 0.01446900625927817
},
"harness|arc:challenge|25": {
"acc": 0.3779863481228669,
"acc_stderr": 0.014169664520303101,
"acc_norm": 0.4087030716723549,
"acc_norm_stderr": 0.014365750345427006
},
"harness|hellaswag|10": {
"acc": 0.469627564230233,
"acc_stderr": 0.004980566907790459,
"acc_norm": 0.6335391356303525,
"acc_norm_stderr": 0.0048085268027185865
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770861,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770861
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3419354838709677,
"acc_stderr": 0.026985289576552725,
"acc_norm": 0.3419354838709677,
"acc_norm_stderr": 0.026985289576552725
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47150259067357514,
"acc_stderr": 0.036025735712884414,
"acc_norm": 0.47150259067357514,
"acc_norm_stderr": 0.036025735712884414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3192660550458716,
"acc_stderr": 0.019987829069750017,
"acc_norm": 0.3192660550458716,
"acc_norm_stderr": 0.019987829069750017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3628691983122363,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.3628691983122363,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.35877862595419846,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.35877862595419846,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591311,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591311
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.04825729337356391,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.04825729337356391
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5213675213675214,
"acc_stderr": 0.032726164476349545,
"acc_norm": 0.5213675213675214,
"acc_norm_stderr": 0.032726164476349545
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.438058748403576,
"acc_stderr": 0.017742232238257223,
"acc_norm": 0.438058748403576,
"acc_norm_stderr": 0.017742232238257223
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841286,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841286
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369918,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3890675241157556,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.3890675241157556,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36728395061728397,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.36728395061728397,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859063,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859063
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2966101694915254,
"acc_stderr": 0.011665946586082838,
"acc_norm": 0.2966101694915254,
"acc_norm_stderr": 0.011665946586082838
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.027971541370170605,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.027971541370170605
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.018152871051538816,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.018152871051538816
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.417910447761194,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.417910447761194,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4327485380116959,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.4327485380116959,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.43794269776602796,
"mc2_stderr": 0.01446900625927817
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf | 2023-08-27T12:42:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of codellama/CodeLlama-34b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T23:11:14.511248](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf/blob/main/results_2023-08-25T23%3A11%3A14.511248.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39529982814936127,\n\
\ \"acc_stderr\": 0.03498378261854782,\n \"acc_norm\": 0.39673912641820325,\n\
\ \"acc_norm_stderr\": 0.03499033569271049,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3796928327645051,\n \"acc_stderr\": 0.014182119866974876,\n\
\ \"acc_norm\": 0.40784982935153585,\n \"acc_norm_stderr\": 0.014361097288449708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2998406691894045,\n\
\ \"acc_stderr\": 0.004572515919210699,\n \"acc_norm\": 0.35660227046405096,\n\
\ \"acc_norm_stderr\": 0.00478016987333286\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.030197611600197953,\n\
\ \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.030197611600197953\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.44193548387096776,\n \"acc_stderr\": 0.02825155790684974,\n \"\
acc_norm\": 0.44193548387096776,\n \"acc_norm_stderr\": 0.02825155790684974\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"\
acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.03559443565563918,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.03559443565563918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.533678756476684,\n \"acc_stderr\": 0.036002440698671784,\n\
\ \"acc_norm\": 0.533678756476684,\n \"acc_norm_stderr\": 0.036002440698671784\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45321100917431195,\n \"acc_stderr\": 0.021343255165546037,\n \"\
acc_norm\": 0.45321100917431195,\n \"acc_norm_stderr\": 0.021343255165546037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605596,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4092827004219409,\n \"acc_stderr\": 0.032007041833595914,\n \
\ \"acc_norm\": 0.4092827004219409,\n \"acc_norm_stderr\": 0.032007041833595914\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.0331883328621728,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.0331883328621728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319773,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319773\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.6196581196581197,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.565772669220945,\n \"acc_stderr\": 0.017724589389677785,\n\
\ \"acc_norm\": 0.565772669220945,\n \"acc_norm_stderr\": 0.017724589389677785\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.02648339204209818,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.02648339204209818\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.20446927374301677,\n \"acc_stderr\": 0.013488813404711917,\n\
\ \"acc_norm\": 0.20446927374301677,\n \"acc_norm_stderr\": 0.013488813404711917\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.02818059632825929,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.02818059632825929\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5048231511254019,\n \"acc_stderr\": 0.028396770444111298,\n\
\ \"acc_norm\": 0.5048231511254019,\n \"acc_norm_stderr\": 0.028396770444111298\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4567901234567901,\n\
\ \"acc_stderr\": 0.02771666165019404,\n \"acc_norm\": 0.4567901234567901,\n\
\ \"acc_norm_stderr\": 0.02771666165019404\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n\
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27835723598435463,\n\
\ \"acc_stderr\": 0.011446990197380985,\n \"acc_norm\": 0.27835723598435463,\n\
\ \"acc_norm_stderr\": 0.011446990197380985\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.028501452860396563,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.028501452860396563\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3431372549019608,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48258706467661694,\n\
\ \"acc_stderr\": 0.03533389234739244,\n \"acc_norm\": 0.48258706467661694,\n\
\ \"acc_norm_stderr\": 0.03533389234739244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n }\n}\n```"
repo_url: https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|arc:challenge|25_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hellaswag|10_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:14.511248.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:14.511248.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T23:11:14.511248.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T23:11:14.511248.parquet'
- config_name: results
data_files:
- split: 2023_08_25T23_11_14.511248
path:
- results_2023-08-25T23:11:14.511248.parquet
- split: latest
path:
- results_2023-08-25T23:11:14.511248.parquet
---
# Dataset Card for Evaluation run of codellama/CodeLlama-34b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf",
	"harness_truthfulqa_mc_0",
	split="latest")
```
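Each config name is derived from the harness task id that appears in the parquet file names (for example, `harness|hendrycksTest-abstract_algebra|5` becomes `harness_hendrycksTest_abstract_algebra_5`). A small helper can reproduce that mapping; this is an illustrative sketch based on the naming pattern visible in this card, not part of the official leaderboard tooling:

```python
def harness_config_name(task: str) -> str:
    """Map a harness task id (as seen in the parquet file names,
    e.g. 'harness|hendrycksTest-abstract_algebra|5') to the
    corresponding config name of this dataset."""
    # The config names replace the '|', '-' and ':' separators with '_'.
    for sep in "|-:":
        task = task.replace(sep, "_")
    return task

print(harness_config_name("harness|hendrycksTest-abstract_algebra|5"))
# harness_hendrycksTest_abstract_algebra_5
print(harness_config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```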
## Latest results
These are the [latest results from run 2023-08-25T23:11:14.511248](https://huggingface.co/datasets/open-llm-leaderboard/details_codellama__CodeLlama-34b-Instruct-hf/blob/main/results_2023-08-25T23%3A11%3A14.511248.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.39529982814936127,
"acc_stderr": 0.03498378261854782,
"acc_norm": 0.39673912641820325,
"acc_norm_stderr": 0.03499033569271049,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
},
"harness|arc:challenge|25": {
"acc": 0.3796928327645051,
"acc_stderr": 0.014182119866974876,
"acc_norm": 0.40784982935153585,
"acc_norm_stderr": 0.014361097288449708
},
"harness|hellaswag|10": {
"acc": 0.2998406691894045,
"acc_stderr": 0.004572515919210699,
"acc_norm": 0.35660227046405096,
"acc_norm_stderr": 0.00478016987333286
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.030197611600197953,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.030197611600197953
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.02825155790684974,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.02825155790684974
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.03559443565563918,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.03559443565563918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.533678756476684,
"acc_stderr": 0.036002440698671784,
"acc_norm": 0.533678756476684,
"acc_norm_stderr": 0.036002440698671784
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45321100917431195,
"acc_stderr": 0.021343255165546037,
"acc_norm": 0.45321100917431195,
"acc_norm_stderr": 0.021343255165546037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605596,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4092827004219409,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.4092827004219409,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.0331883328621728,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.0331883328621728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319773,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319773
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.565772669220945,
"acc_stderr": 0.017724589389677785,
"acc_norm": 0.565772669220945,
"acc_norm_stderr": 0.017724589389677785
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.013488813404711917,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.013488813404711917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5048231511254019,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.5048231511254019,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.02771666165019404,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.02771666165019404
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27835723598435463,
"acc_stderr": 0.011446990197380985,
"acc_norm": 0.27835723598435463,
"acc_norm_stderr": 0.011446990197380985
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.028501452860396563,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.028501452860396563
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48258706467661694,
"acc_stderr": 0.03533389234739244,
"acc_norm": 0.48258706467661694,
"acc_norm_stderr": 0.03533389234739244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16 | 2023-08-27T12:43:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/CodeLlama-13B-Instruct-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/CodeLlama-13B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T23:11:55.664382](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16/blob/main/results_2023-08-25T23%3A11%3A55.664382.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3896359912218762,\n\
\ \"acc_stderr\": 0.03514263009984807,\n \"acc_norm\": 0.39312135846415236,\n\
\ \"acc_norm_stderr\": 0.03514155527381711,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.45878663529563757,\n\
\ \"mc2_stderr\": 0.014860043549181953\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4087030716723549,\n \"acc_stderr\": 0.014365750345427006,\n\
\ \"acc_norm\": 0.4462457337883959,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4812786297550289,\n\
\ \"acc_stderr\": 0.004986282450647317,\n \"acc_norm\": 0.6493726349332802,\n\
\ \"acc_norm_stderr\": 0.004761912511707506\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.38113207547169814,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.38113207547169814,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.031410821975962414,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.031410821975962414\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906866,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906866\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4064516129032258,\n \"acc_stderr\": 0.0279417273462563,\n \"acc_norm\"\
: 0.4064516129032258,\n \"acc_norm_stderr\": 0.0279417273462563\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n\
\ \"acc_stderr\": 0.03240661565868408,\n \"acc_norm\": 0.3054187192118227,\n\
\ \"acc_norm_stderr\": 0.03240661565868408\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.3575757575757576,\n \"acc_stderr\": 0.03742597043806585,\n \
\ \"acc_norm\": 0.3575757575757576,\n \"acc_norm_stderr\": 0.03742597043806585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5077720207253886,\n \"acc_stderr\": 0.03608003225569654,\n\
\ \"acc_norm\": 0.5077720207253886,\n \"acc_norm_stderr\": 0.03608003225569654\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.02446861524147892,\n\
\ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.02446861524147892\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097863,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097863\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.03149930577784906,\n\
\ \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.03149930577784906\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4990825688073395,\n \"acc_stderr\": 0.021437287056051215,\n \"\
acc_norm\": 0.4990825688073395,\n \"acc_norm_stderr\": 0.021437287056051215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.38235294117647056,\n \"acc_stderr\": 0.03410785338904719,\n \"\
acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.03410785338904719\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.38396624472573837,\n \"acc_stderr\": 0.031658678064106674,\n \
\ \"acc_norm\": 0.38396624472573837,\n \"acc_norm_stderr\": 0.031658678064106674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.038566721635489125,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.038566721635489125\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4563106796116505,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.4563106796116505,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.02987257770889117,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.02987257770889117\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562427,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562427\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4955300127713921,\n\
\ \"acc_stderr\": 0.017879248970584377,\n \"acc_norm\": 0.4955300127713921,\n\
\ \"acc_norm_stderr\": 0.017879248970584377\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3468208092485549,\n \"acc_stderr\": 0.025624723994030457,\n\
\ \"acc_norm\": 0.3468208092485549,\n \"acc_norm_stderr\": 0.025624723994030457\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317003,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317003\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40836012861736337,\n\
\ \"acc_stderr\": 0.027917050748484634,\n \"acc_norm\": 0.40836012861736337,\n\
\ \"acc_norm_stderr\": 0.027917050748484634\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02712511551316686,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02712511551316686\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2926988265971317,\n\
\ \"acc_stderr\": 0.011620949195849528,\n \"acc_norm\": 0.2926988265971317,\n\
\ \"acc_norm_stderr\": 0.011620949195849528\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.028888193103988647,\n\
\ \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.028888193103988647\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3284313725490196,\n \"acc_stderr\": 0.01899970738316267,\n \
\ \"acc_norm\": 0.3284313725490196,\n \"acc_norm_stderr\": 0.01899970738316267\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4577114427860697,\n\
\ \"acc_stderr\": 0.035228658640995975,\n \"acc_norm\": 0.4577114427860697,\n\
\ \"acc_norm_stderr\": 0.035228658640995975\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4678362573099415,\n \"acc_stderr\": 0.038268824176603676,\n\
\ \"acc_norm\": 0.4678362573099415,\n \"acc_norm_stderr\": 0.038268824176603676\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.45878663529563757,\n\
\ \"mc2_stderr\": 0.014860043549181953\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|arc:challenge|25_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hellaswag|10_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T23:11:55.664382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T23:11:55.664382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T23:11:55.664382.parquet'
- config_name: results
data_files:
- split: 2023_08_25T23_11_55.664382
path:
- results_2023-08-25T23:11:55.664382.parquet
- split: latest
path:
- results_2023-08-25T23:11:55.664382.parquet
---
# Dataset Card for Evaluation run of TheBloke/CodeLlama-13B-Instruct-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/CodeLlama-13B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-13B-Instruct-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-25T23:11:55.664382](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-13B-Instruct-fp16/blob/main/results_2023-08-25T23%3A11%3A55.664382.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own "latest" split):
```python
{
"all": {
"acc": 0.3896359912218762,
"acc_stderr": 0.03514263009984807,
"acc_norm": 0.39312135846415236,
"acc_norm_stderr": 0.03514155527381711,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.45878663529563757,
"mc2_stderr": 0.014860043549181953
},
"harness|arc:challenge|25": {
"acc": 0.4087030716723549,
"acc_stderr": 0.014365750345427006,
"acc_norm": 0.4462457337883959,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.4812786297550289,
"acc_stderr": 0.004986282450647317,
"acc_norm": 0.6493726349332802,
"acc_norm_stderr": 0.004761912511707506
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.38113207547169814,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.38113207547169814,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.031410821975962414,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.031410821975962414
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906866,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906866
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4064516129032258,
"acc_stderr": 0.0279417273462563,
"acc_norm": 0.4064516129032258,
"acc_norm_stderr": 0.0279417273462563
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3575757575757576,
"acc_stderr": 0.03742597043806585,
"acc_norm": 0.3575757575757576,
"acc_norm_stderr": 0.03742597043806585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5077720207253886,
"acc_stderr": 0.03608003225569654,
"acc_norm": 0.5077720207253886,
"acc_norm_stderr": 0.03608003225569654
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097863,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097863
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4990825688073395,
"acc_stderr": 0.021437287056051215,
"acc_norm": 0.4990825688073395,
"acc_norm_stderr": 0.021437287056051215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.03410785338904719,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.03410785338904719
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.38396624472573837,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.38396624472573837,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.038566721635489125,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.038566721635489125
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.4563106796116505,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.4563106796116505,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.02987257770889117,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.02987257770889117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4955300127713921,
"acc_stderr": 0.017879248970584377,
"acc_norm": 0.4955300127713921,
"acc_norm_stderr": 0.017879248970584377
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.025624723994030457,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.025624723994030457
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317003,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40836012861736337,
"acc_stderr": 0.027917050748484634,
"acc_norm": 0.40836012861736337,
"acc_norm_stderr": 0.027917050748484634
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02712511551316686,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02712511551316686
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2926988265971317,
"acc_stderr": 0.011620949195849528,
"acc_norm": 0.2926988265971317,
"acc_norm_stderr": 0.011620949195849528
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.028888193103988647,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.028888193103988647
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3284313725490196,
"acc_stderr": 0.01899970738316267,
"acc_norm": 0.3284313725490196,
"acc_norm_stderr": 0.01899970738316267
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4577114427860697,
"acc_stderr": 0.035228658640995975,
"acc_norm": 0.4577114427860697,
"acc_norm_stderr": 0.035228658640995975
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4678362573099415,
"acc_stderr": 0.038268824176603676,
"acc_norm": 0.4678362573099415,
"acc_norm_stderr": 0.038268824176603676
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.45878663529563757,
"mc2_stderr": 0.014860043549181953
}
}
```
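Once loaded, each task's scores are plain nested dictionaries, so they can be inspected with ordinary Python. As a small illustration (a few per-task values from the results above are inlined here rather than fetched), tasks can be ranked by accuracy like this:

```python
# A few per-task scores copied from the results above, inlined for illustration.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.7051282051282052},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.56},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.2222222222222222},
}

# Rank tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, scores in ranked:
    print(f"{task}: acc={scores['acc']:.3f}")
```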
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nyntany/REPO | 2023-09-18T18:53:09.000Z | [
"license:openrail",
"region:us"
] | nyntany | null | null | null | 0 | 0 | ---
license: openrail
---
|
sendai-test1/test1 | 2023-08-25T23:40:26.000Z | [
"region:us"
] | sendai-test1 | null | null | null | 0 | 0 | Entry not found |
P1ayer-1/isbndb | 2023-08-26T00:02:38.000Z | [
"region:us"
] | P1ayer-1 | null | null | null | 0 | 0 | Entry not found |
ioakslman24/ioakslman24 | 2023-08-25T23:43:16.000Z | [
"region:us"
] | ioakslman24 | null | null | null | 0 | 0 | Entry not found |
vbasgyat23/vbasgyat23 | 2023-08-25T23:45:07.000Z | [
"region:us"
] | vbasgyat23 | null | null | null | 0 | 0 | Entry not found |
Kawai868/Kawai868 | 2023-08-25T23:47:22.000Z | [
"region:us"
] | Kawai868 | null | null | null | 0 | 0 | Entry not found |
Takinata634/Takinata634 | 2023-08-25T23:47:26.000Z | [
"region:us"
] | Takinata634 | null | null | null | 0 | 0 | Entry not found |
Watari8459/Watari8459 | 2023-08-25T23:50:07.000Z | [
"region:us"
] | Watari8459 | null | null | null | 0 | 0 | Entry not found |
Noring453/Noring453 | 2023-08-25T23:50:10.000Z | [
"region:us"
] | Noring453 | null | null | null | 0 | 0 | Entry not found |
Umeki8934/Umeki8934 | 2023-08-25T23:51:47.000Z | [
"region:us"
] | Umeki8934 | null | null | null | 0 | 0 | Entry not found |
Inoue453/Inoue453 | 2023-08-25T23:52:28.000Z | [
"region:us"
] | Inoue453 | null | null | null | 0 | 0 | Entry not found |
Komuro893/Komuro893 | 2023-08-25T23:54:06.000Z | [
"region:us"
] | Komuro893 | null | null | null | 0 | 0 | Entry not found |
danjacobellis/imagenet_RDAE | 2023-08-29T14:57:42.000Z | [
"arxiv:1511.06281",
"region:us"
] | danjacobellis | null | null | null | 0 | 0 | # imagenet-RDAE
This dataset consists of the [ImageNet-1k Dataset (original size 216 GB)](https://huggingface.co/datasets/imagenet-1k) compressed to 7 GB using a [Rate-Distortion Autoencoder](https://www.cns.nyu.edu/~lcv/iclr2017/).
## Using the RDAE
The rate-distortion autoencoder was trained on [Vimeo90k](http://toflow.csail.mit.edu/). The [training script is available here](https://github.com/danjacobellis/moco/blob/main/train_RDAE_on_vimeo.ipynb).
An example showing how to use the pretrained model for both encoding and decoding is provided below.
```python
import os
import zlib
import requests
import numpy as np
import PIL.Image as Image
from IPython.display import display
import torch
import torch.nn as nn
from torchvision import transforms
import compressai
from compressai.entropy_models import EntropyBottleneck
from compressai.layers import GDN
from compressai.models import CompressionModel
from compressai.models.utils import conv, deconv
```
### Get an example image
```python
r = requests.get("https://r0k.us/graphics/kodak/kodak/kodim05.png", stream=True)
img = Image.open(r.raw); img
```

The autoencoder expects images to be scaled to the range $[-0.5, 0.5]$.
```python
def pil_to_pt(img):
    # map pixel values from [0, 255] to [-0.5, 0.5] and add a batch dimension
    t = transforms.functional.pil_to_tensor(img)
    t = t.to(torch.float)
    t = t/255
    t = t-0.5
    t = t.unsqueeze(0)
    return t

def pt_to_pil(t):
    # map back to [0, 255]; clamping to (-0.49, 255.49) keeps round() within uint8 range
    t = t+0.5
    t = t*255
    t = torch.clamp(t, min=-0.49, max=255.49)
    t = t.round()
    t = t.to(torch.uint8)
    return t
```
The autoencoder uses three convolutional layers and [generalized divisive normalization](https://arxiv.org/abs/1511.06281).
```python
class Network(CompressionModel):
def __init__(self, N=128):
super().__init__()
self.entropy_bottleneck = EntropyBottleneck(N)
self.encode = nn.Sequential(
conv(3, N),
GDN(N),
conv(N, N),
GDN(N),
conv(N, N),
)
self.decode = nn.Sequential(
deconv(N, N),
GDN(N, inverse=True),
deconv(N, N),
GDN(N, inverse=True),
deconv(N, 3),
)
def forward(self, x):
y = self.encode(x)
y_hat, y_likelihoods = self.entropy_bottleneck(y)
x_hat = self.decode(y_hat)
return x_hat, y_likelihoods
```
Compression occurs in two stages: a lossy analysis transform followed by lossless entropy coding. The Python zlib module is used for entropy coding.
```python
def lossy_analysis_transform(img):
x = pil_to_pt(img).to("cuda")
z = net.encode(x).round().to(torch.int8).detach().to("cpu").numpy()
return z
def lossless_entropy_encode(z):
original_shape = z.shape
compressed_img = zlib.compress(z.tobytes(), level=9)
return compressed_img, original_shape
def compress(img):
z = lossy_analysis_transform(img)
compressed_img, original_shape = lossless_entropy_encode(z)
return compressed_img, original_shape
```
```python
def entropy_decoder(compressed_img,original_shape):
decompressed = zlib.decompress(compressed_img)
ẑ = np.frombuffer(decompressed, dtype=np.int8)
ẑ = ẑ.reshape(original_shape)
return ẑ
def synthesis_transform(ẑ):
ẑ = torch.tensor(ẑ).to("cuda").to(torch.float)
x̂ = net.decode(ẑ).detach().to("cpu")
return x̂
def decompress(compressed_img, original_shape):
ẑ = entropy_decoder(compressed_img,original_shape)
x̂ = synthesis_transform(ẑ)
return x̂
```
Load the pretrained network.
```python
net = Network()
net = net.to("cuda")
checkpoint = torch.load("checkpoint.pth")
net.load_state_dict(checkpoint['model_state_dict'])
```
<All keys matched successfully>
Save a copy using JPEG for comparison.
```python
img.save("kodim05.jpg", "JPEG", quality=5)
jpeg = Image.open("kodim05.jpg");
```
Compress and then decompress the image.
```python
compressed_img, original_shape = compress(img);
print("Bytes in compressed image:", len(compressed_img))
x̂ = decompress(compressed_img, original_shape)
```
Bytes in compressed image: 14391
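For context, the compressed size reported above corresponds to roughly 0.29 bits per pixel. This is a quick sketch of the calculation, using the byte count from this run and the dimensions of the Kodak test image (768×512 RGB):

```python
# estimate the rate in bits per pixel from the compressed size above
compressed_bytes = 14391          # size reported by compress() for kodim05
width, height = 768, 512          # dimensions of the Kodak test image
bpp = 8 * compressed_bytes / (width * height)
print(f"{bpp:.3f} bits per pixel")  # → 0.293 bits per pixel
```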
Compare the quality of the RDAE-compressed image to JPEG and to the original.
```python
print('original image');
display(img)
print("JPEG compression ratio", (3*512*768)/os.path.getsize('kodim05.jpg'))
display(jpeg);
print("RDAE compression ratio", (3*512*768)/len(compressed_img))
display(transforms.ToPILImage()(pt_to_pil(x̂)[0]))
```
original image

JPEG compression ratio 77.25771170345143

RDAE compression ratio 81.97123202001251

View the distribution of codes produced by the RDAE.
```python
import matplotlib.pyplot as plt
z = lossy_analysis_transform(img)
histogram = torch.histogram(torch.tensor(z).to(torch.float),bins=33, range=(-16.5,16.5))
x = np.convolve(histogram.bin_edges,[0.5,0.5])[1:-1]
y = np.log(1+histogram.hist)
plt.bar(x,y);
```

|
Isaki342/Isaki342 | 2023-08-25T23:55:17.000Z | [
"region:us"
] | Isaki342 | null | null | null | 0 | 0 | Entry not found |
Narisawa8393/Narisawa8393 | 2023-08-25T23:56:55.000Z | [
"region:us"
] | Narisawa8393 | null | null | null | 0 | 0 | Entry not found |
saidancok83/Ayano344 | 2023-08-25T23:58:34.000Z | [
"region:us"
] | saidancok83 | null | null | null | 0 | 0 | Entry not found |
Ayano344/Ayano344 | 2023-08-25T23:58:46.000Z | [
"region:us"
] | Ayano344 | null | null | null | 0 | 0 | Entry not found |
Anno453/Anno453 | 2023-08-26T00:00:39.000Z | [
"region:us"
] | Anno453 | null | null | null | 0 | 0 | Entry not found |
Asuka532/Asuka532 | 2023-08-26T00:00:43.000Z | [
"region:us"
] | Asuka532 | null | null | null | 0 | 0 | Entry not found |
Manaka553/Manaka553 | 2023-08-26T00:03:52.000Z | [
"region:us"
] | Manaka553 | null | null | null | 0 | 0 | Entry not found |
Yoshiko454/Yoshiko454 | 2023-08-26T00:03:56.000Z | [
"region:us"
] | Yoshiko454 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf | 2023-09-17T04:22:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NousResearch/CodeLlama-7b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/CodeLlama-7b-hf](https://huggingface.co/NousResearch/CodeLlama-7b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T04:22:12.772861](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf/blob/main/results_2023-09-17T04-22-12.772861.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n\
\ \"em_stderr\": 0.00023443780464835895,\n \"f1\": 0.05166212248322184,\n\
\ \"f1_stderr\": 0.0012470290169941962,\n \"acc\": 0.3516817229574676,\n\
\ \"acc_stderr\": 0.00983671270422883\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464835895,\n\
\ \"f1\": 0.05166212248322184,\n \"f1_stderr\": 0.0012470290169941962\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n \
\ \"acc_stderr\": 0.006257444037912531\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.648776637726914,\n \"acc_stderr\": 0.013415981370545131\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NousResearch/CodeLlama-7b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T04_22_12.772861
path:
- '**/details_harness|drop|3_2023-09-17T04-22-12.772861.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T04-22-12.772861.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T04_22_12.772861
path:
- '**/details_harness|gsm8k|5_2023-09-17T04-22-12.772861.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T04-22-12.772861.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:03:47.670325.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:03:47.670325.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T04_22_12.772861
path:
- '**/details_harness|winogrande|5_2023-09-17T04-22-12.772861.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T04-22-12.772861.parquet'
- config_name: results
data_files:
- split: 2023_08_26T00_03_47.670325
path:
- results_2023-08-26T00:03:47.670325.parquet
- split: 2023_09_17T04_22_12.772861
path:
- results_2023-09-17T04-22-12.772861.parquet
- split: latest
path:
- results_2023-09-17T04-22-12.772861.parquet
---
# Dataset Card for Evaluation run of NousResearch/CodeLlama-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/CodeLlama-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/CodeLlama-7b-hf](https://huggingface.co/NousResearch/CodeLlama-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf",
"harness_winogrande_5",
split="train")
```
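Split names are derived from the run timestamp: judging from the file and split names above, the characters `-` and `:` in the ISO timestamp are replaced by `_` (with "latest" aliasing the newest run). A minimal sketch of that inferred convention (the helper name is illustrative, not part of the dataset API):

```python
def timestamp_to_split_name(iso_timestamp: str) -> str:
    """Convert a run timestamp like '2023-08-26T00:03:47.670325' into the
    corresponding split name '2023_08_26T00_03_47.670325' by replacing
    '-' and ':' with '_'."""
    return iso_timestamp.replace("-", "_").replace(":", "_")

# e.g. pick the split for a specific run instead of "latest":
print(timestamp_to_split_name("2023-09-17T04:22:12.772861"))
```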
## Latest results
These are the [latest results from run 2023-09-17T04:22:12.772861](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-7b-hf/blob/main/results_2023-09-17T04-22-12.772861.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464835895,
"f1": 0.05166212248322184,
"f1_stderr": 0.0012470290169941962,
"acc": 0.3516817229574676,
"acc_stderr": 0.00983671270422883
},
"harness|drop|3": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464835895,
"f1": 0.05166212248322184,
"f1_stderr": 0.0012470290169941962
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.006257444037912531
},
"harness|winogrande|5": {
"acc": 0.648776637726914,
"acc_stderr": 0.013415981370545131
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v3 | 2023-08-27T12:43:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/llama-2-13B-ensemble-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-13B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T00:04:59.687493](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v3/blob/main/results_2023-08-26T00%3A04%3A59.687493.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the \"results\" config and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5779190308106052,\n\
\ \"acc_stderr\": 0.03411621449047892,\n \"acc_norm\": 0.5817150329939268,\n\
\ \"acc_norm_stderr\": 0.03409598894931763,\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.49782296764839973,\n\
\ \"mc2_stderr\": 0.015206569782538341\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5981228668941979,\n \"acc_stderr\": 0.014327268614578274,\n\
\ \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407161\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6245767775343557,\n\
\ \"acc_stderr\": 0.004832423630593182,\n \"acc_norm\": 0.8229436367257519,\n\
\ \"acc_norm_stderr\": 0.0038093627612481094\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.038047497443647646,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.038047497443647646\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798307,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798307\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.0267955608481228,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.0267955608481228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454806,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694838,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694838\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.02763490726417854,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.02763490726417854\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.018175110510343567,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.018175110510343567\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335837,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335837\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977257,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977257\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.027530078447110307,\n\
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.027530078447110307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41395045632333766,\n\
\ \"acc_stderr\": 0.012579699631289262,\n \"acc_norm\": 0.41395045632333766,\n\
\ \"acc_norm_stderr\": 0.012579699631289262\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.01994491413687358,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.01994491413687358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.49782296764839973,\n\
\ \"mc2_stderr\": 0.015206569782538341\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-13B-ensemble-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:04:59.687493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:04:59.687493.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:04:59.687493.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:04:59.687493.parquet'
- config_name: results
data_files:
- split: 2023_08_26T00_04_59.687493
path:
- results_2023-08-26T00:04:59.687493.parquet
- split: latest
path:
- results_2023-08-26T00:04:59.687493.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-13B-ensemble-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-13B-ensemble-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v3",
"harness_truthfulqa_mc_0",
split="train")
```
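Once loaded, each per-task configuration exposes scalar metrics such as `acc_norm` or `mc2`. As a self-contained illustration of working with these numbers (the values below are copied from this run's latest results, not recomputed), a simple macro-average across a few chosen benchmarks can be taken like this:

```python
# Metric values copied from the latest results of this run (see the JSON
# dump below); this is an illustrative sketch, not the leaderboard's
# official scoring code.
latest = {
    "harness|arc:challenge|25": {"acc_norm": 0.6237201365187713},
    "harness|hellaswag|10": {"acc_norm": 0.8229436367257519},
    "harness|truthfulqa:mc|0": {"mc2": 0.49782296764839973},
}

def macro_average(results, metric_per_task):
    """Average one chosen metric per task, weighting every task equally."""
    vals = [results[task][metric] for task, metric in metric_per_task.items()]
    return sum(vals) / len(vals)

avg = macro_average(
    latest,
    {
        "harness|arc:challenge|25": "acc_norm",
        "harness|hellaswag|10": "acc_norm",
        "harness|truthfulqa:mc|0": "mc2",
    },
)
print(round(avg, 4))
```

Note that the hendrycksTest (MMLU) configs would normally be included as well; they are omitted here only to keep the sketch short.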
## Latest results
These are the [latest results from run 2023-08-26T00:04:59.687493](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v3/blob/main/results_2023-08-26T00%3A04%3A59.687493.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5779190308106052,
"acc_stderr": 0.03411621449047892,
"acc_norm": 0.5817150329939268,
"acc_norm_stderr": 0.03409598894931763,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.49782296764839973,
"mc2_stderr": 0.015206569782538341
},
"harness|arc:challenge|25": {
"acc": 0.5981228668941979,
"acc_stderr": 0.014327268614578274,
"acc_norm": 0.6237201365187713,
"acc_norm_stderr": 0.014157022555407161
},
"harness|hellaswag|10": {
"acc": 0.6245767775343557,
"acc_stderr": 0.004832423630593182,
"acc_norm": 0.8229436367257519,
"acc_norm_stderr": 0.0038093627612481094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.038047497443647646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.038047497443647646
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798307,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798307
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.0267955608481228,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.0267955608481228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454806,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694838,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.02763490726417854,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.02763490726417854
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.018175110510343567,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.018175110510343567
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335837,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335837
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977257,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977257
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.027530078447110307,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.027530078447110307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41395045632333766,
"acc_stderr": 0.012579699631289262,
"acc_norm": 0.41395045632333766,
"acc_norm_stderr": 0.012579699631289262
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.49782296764839973,
"mc2_stderr": 0.015206569782538341
}
}
```
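As an illustration, per-task metrics like the ones in the JSON above can be averaged into an overall score with a few lines of Python. This is a minimal sketch (the task names and values below are a small excerpt copied from the results; `mean_accuracy` is a helper defined here, not part of any library):

```python
import json

# A tiny excerpt of the per-task results shown above, in the same shape
# ("harness|<task>|<n_shots>" -> metrics dict with an "acc" field).
results_json = """
{
  "harness|hendrycksTest-marketing|5": {"acc": 0.8290598290598291},
  "harness|hendrycksTest-virology|5": {"acc": 0.463855421686747},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.7777777777777778}
}
"""

def mean_accuracy(results: dict) -> float:
    """Average the 'acc' metric over every task that reports one."""
    accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
    return sum(accs) / len(accs)

results = json.loads(results_json)
print(round(mean_accuracy(results), 4))
```

The same helper works on the full results dict once the non-task entries (e.g. the `"all"` aggregate and the TruthfulQA block, which uses `mc1`/`mc2` instead of `acc`) are filtered out.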
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Kotake734/Kotake734 | 2023-08-26T00:05:40.000Z | [
"region:us"
] | Kotake734 | null | null | null | 0 | 0 | Entry not found |
Takinge45/Takinge45 | 2023-08-26T00:06:05.000Z | [
"region:us"
] | Takinge45 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf | 2023-08-27T12:43:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/CodeLlama-34b-Instruct-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/CodeLlama-34b-Instruct-hf](https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T00:11:17.332215](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf/blob/main/results_2023-08-26T00%3A11%3A17.332215.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3954825543560614,\n\
\ \"acc_stderr\": 0.034996131407759465,\n \"acc_norm\": 0.39693969001192136,\n\
\ \"acc_norm_stderr\": 0.03500279971831286,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.378839590443686,\n \"acc_stderr\": 0.01417591549000032,\n\
\ \"acc_norm\": 0.40784982935153585,\n \"acc_norm_stderr\": 0.014361097288449708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2998406691894045,\n\
\ \"acc_stderr\": 0.004572515919210699,\n \"acc_norm\": 0.35680143397729536,\n\
\ \"acc_norm_stderr\": 0.004780764443411313\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.030197611600197953,\n\
\ \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.030197611600197953\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.44193548387096776,\n \"acc_stderr\": 0.02825155790684974,\n \"\
acc_norm\": 0.44193548387096776,\n \"acc_norm_stderr\": 0.02825155790684974\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"\
acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.03559443565563918,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.03559443565563918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.533678756476684,\n \"acc_stderr\": 0.036002440698671784,\n\
\ \"acc_norm\": 0.533678756476684,\n \"acc_norm_stderr\": 0.036002440698671784\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45321100917431195,\n \"acc_stderr\": 0.021343255165546037,\n \"\
acc_norm\": 0.45321100917431195,\n \"acc_norm_stderr\": 0.021343255165546037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605596,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4008438818565401,\n \"acc_stderr\": 0.03190080389473236,\n \
\ \"acc_norm\": 0.4008438818565401,\n \"acc_norm_stderr\": 0.03190080389473236\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.0331883328621728,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.0331883328621728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319773,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319773\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.6196581196581197,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.565772669220945,\n \"acc_stderr\": 0.017724589389677785,\n\
\ \"acc_norm\": 0.565772669220945,\n \"acc_norm_stderr\": 0.017724589389677785\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.02648339204209818,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.02648339204209818\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.20446927374301677,\n \"acc_stderr\": 0.013488813404711917,\n\
\ \"acc_norm\": 0.20446927374301677,\n \"acc_norm_stderr\": 0.013488813404711917\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.02818059632825929,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.02818059632825929\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5048231511254019,\n \"acc_stderr\": 0.028396770444111298,\n\
\ \"acc_norm\": 0.5048231511254019,\n \"acc_norm_stderr\": 0.028396770444111298\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4567901234567901,\n\
\ \"acc_stderr\": 0.02771666165019404,\n \"acc_norm\": 0.4567901234567901,\n\
\ \"acc_norm_stderr\": 0.02771666165019404\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n\
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27249022164276404,\n\
\ \"acc_stderr\": 0.011371658294311514,\n \"acc_norm\": 0.27249022164276404,\n\
\ \"acc_norm_stderr\": 0.011371658294311514\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.33088235294117646,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.33088235294117646,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3431372549019608,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2897959183673469,\n \"acc_stderr\": 0.02904308868330432,\n\
\ \"acc_norm\": 0.2897959183673469,\n \"acc_norm_stderr\": 0.02904308868330432\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48258706467661694,\n\
\ \"acc_stderr\": 0.03533389234739244,\n \"acc_norm\": 0.48258706467661694,\n\
\ \"acc_norm_stderr\": 0.03533389234739244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:11:17.332215.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:11:17.332215.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:11:17.332215.parquet'
- config_name: results
data_files:
- split: 2023_08_26T00_11_17.332215
path:
- results_2023-08-26T00:11:17.332215.parquet
- split: latest
path:
- results_2023-08-26T00:11:17.332215.parquet
---
# Dataset Card for Evaluation run of ehartford/CodeLlama-34b-Instruct-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/CodeLlama-34b-Instruct-hf](https://huggingface.co/ehartford/CodeLlama-34b-Instruct-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T00:11:17.332215](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Instruct-hf/blob/main/results_2023-08-26T00%3A11%3A17.332215.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3954825543560614,
"acc_stderr": 0.034996131407759465,
"acc_norm": 0.39693969001192136,
"acc_norm_stderr": 0.03500279971831286,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
},
"harness|arc:challenge|25": {
"acc": 0.378839590443686,
"acc_stderr": 0.01417591549000032,
"acc_norm": 0.40784982935153585,
"acc_norm_stderr": 0.014361097288449708
},
"harness|hellaswag|10": {
"acc": 0.2998406691894045,
"acc_stderr": 0.004572515919210699,
"acc_norm": 0.35680143397729536,
"acc_norm_stderr": 0.004780764443411313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.030197611600197953,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.030197611600197953
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.02825155790684974,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.02825155790684974
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.03559443565563918,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.03559443565563918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.533678756476684,
"acc_stderr": 0.036002440698671784,
"acc_norm": 0.533678756476684,
"acc_norm_stderr": 0.036002440698671784
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45321100917431195,
"acc_stderr": 0.021343255165546037,
"acc_norm": 0.45321100917431195,
"acc_norm_stderr": 0.021343255165546037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605596,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4008438818565401,
"acc_stderr": 0.03190080389473236,
"acc_norm": 0.4008438818565401,
"acc_norm_stderr": 0.03190080389473236
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.0331883328621728,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.0331883328621728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319773,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319773
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.565772669220945,
"acc_stderr": 0.017724589389677785,
"acc_norm": 0.565772669220945,
"acc_norm_stderr": 0.017724589389677785
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.013488813404711917,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.013488813404711917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5048231511254019,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.5048231511254019,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.02771666165019404,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.02771666165019404
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27249022164276404,
"acc_stderr": 0.011371658294311514,
"acc_norm": 0.27249022164276404,
"acc_norm_stderr": 0.011371658294311514
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.33088235294117646,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.33088235294117646,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2897959183673469,
"acc_stderr": 0.02904308868330432,
"acc_norm": 0.2897959183673469,
"acc_norm_stderr": 0.02904308868330432
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48258706467661694,
"acc_stderr": 0.03533389234739244,
"acc_norm": 0.48258706467661694,
"acc_norm_stderr": 0.03533389234739244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
}
}
```
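As a quick sanity check, the aggregated `"all"` accuracy reported above is simply an average over the per-task entries. A minimal sketch of that computation, using a hand-copied subset of the results dict shown (in practice you would load the full `results` configuration instead of typing entries by hand):

```python
# Minimal sketch: recompute a mean accuracy from per-task results.
# The dict below hand-copies a few entries from the results JSON above;
# it is illustrative only, not the full set of 61 tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.35555555555555557},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3684210526315789},
}

# Collect per-task accuracies and average them.
accs = [entry["acc"] for entry in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```

With the full task dict, the same loop reproduces the `"all"` block's `acc` value.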
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_credit_gosdt_l512_d3_sd3 | 2023-08-26T00:38:03.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 1571720200
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_credit_gosdt_l512_d3_sd3"
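The byte counts declared in the `dataset_info` block above can be sanity-checked locally; a minimal sketch (all numbers copied from the YAML, no network access needed):

```python
# Sanity-check: the declared dataset_size should equal the sum of the
# per-split num_bytes values from the dataset_info block above.
splits = {"train": 6_767_200_000, "validation": 676_720_000}
declared_size = 7_443_920_000

assert sum(splits.values()) == declared_size
print("split byte counts are consistent with dataset_size")
```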
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_credit_gosdt_l512_d3_sd2 | 2023-08-26T00:41:01.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 1577909110
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_credit_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
totally-not-an-llm/PileV2-axolotlformat-smallmix | 2023-08-26T00:56:14.000Z | [
"region:us"
] | totally-not-an-llm | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_nicholasKluge__Aira-2-124M | 2023-08-27T12:43:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of nicholasKluge/Aira-2-124M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nicholasKluge/Aira-2-124M](https://huggingface.co/nicholasKluge/Aira-2-124M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-2-124M\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T00:58:54.483693](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-124M/blob/main/results_2023-08-26T00%3A58%3A54.483693.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2563784281118179,\n\
\ \"acc_stderr\": 0.03131922643477471,\n \"acc_norm\": 0.25747333491194596,\n\
\ \"acc_norm_stderr\": 0.03133423457395941,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.39825983953563676,\n\
\ \"mc2_stderr\": 0.014916655527587098\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n\
\ \"acc_norm\": 0.2431740614334471,\n \"acc_norm_stderr\": 0.012536554144587094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2907787293367855,\n\
\ \"acc_stderr\": 0.004531935391507024,\n \"acc_norm\": 0.3152758414658435,\n\
\ \"acc_norm_stderr\": 0.004636760762522853\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.02661648298050171,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.02661648298050171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.0291012906983867,\n\
\ \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.0291012906983867\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.02450347255711094,\n \
\ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.02450347255711094\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3025210084033613,\n \"acc_stderr\": 0.02983796238829193,\n \
\ \"acc_norm\": 0.3025210084033613,\n \"acc_norm_stderr\": 0.02983796238829193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n\
\ \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.25980392156862747,\n\
\ \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n\
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\
\ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\
\ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.040073418097558065,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.040073418097558065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.02559819368665225,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.02559819368665225\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.227330779054917,\n\
\ \"acc_stderr\": 0.014987270640946015,\n \"acc_norm\": 0.227330779054917,\n\
\ \"acc_norm_stderr\": 0.014987270640946015\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071138,\n\
\ \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071138\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19292604501607716,\n\
\ \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.19292604501607716,\n\
\ \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543343,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543343\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279338,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.040693063197213754,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.040693063197213754\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014638,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014638\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789437,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789437\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.39825983953563676,\n\
\ \"mc2_stderr\": 0.014916655527587098\n }\n}\n```"
repo_url: https://huggingface.co/nicholasKluge/Aira-2-124M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T00:58:54.483693.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:58:54.483693.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T00:58:54.483693.parquet'
- config_name: results
data_files:
- split: 2023_08_26T00_58_54.483693
path:
- results_2023-08-26T00:58:54.483693.parquet
- split: latest
path:
- results_2023-08-26T00:58:54.483693.parquet
---
# Dataset Card for Evaluation run of nicholasKluge/Aira-2-124M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-2-124M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-2-124M](https://huggingface.co/nicholasKluge/Aira-2-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-2-124M",
"harness_truthfulqa_mc_0",
split="train")
```
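
The split names shown in the configs above are derived from the run timestamp. A minimal sketch of the naming convention (assuming it simply replaces `-` and `:` with underscores; `timestamp_to_split_name` is a hypothetical helper, not part of the `datasets` API):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp to the split name used in this dataset.

    Assumption: split names appear to be the ISO timestamp with '-' and ':'
    replaced by underscores, based on the config listing above.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-08-26T00:58:54.483693"))
# → 2023_08_26T00_58_54.483693
```

With the resulting name you can request that specific run, e.g. `split="2023_08_26T00_58_54.483693"`, instead of `split="train"`.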
## Latest results
These are the [latest results from run 2023-08-26T00:58:54.483693](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-2-124M/blob/main/results_2023-08-26T00%3A58%3A54.483693.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2563784281118179,
"acc_stderr": 0.03131922643477471,
"acc_norm": 0.25747333491194596,
"acc_norm_stderr": 0.03133423457395941,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.39825983953563676,
"mc2_stderr": 0.014916655527587098
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.2431740614334471,
"acc_norm_stderr": 0.012536554144587094
},
"harness|hellaswag|10": {
"acc": 0.2907787293367855,
"acc_stderr": 0.004531935391507024,
"acc_norm": 0.3152758414658435,
"acc_norm_stderr": 0.004636760762522853
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.02661648298050171,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.02661648298050171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.0291012906983867,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.0291012906983867
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.02450347255711094,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.02450347255711094
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3025210084033613,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.3025210084033613,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.040073418097558065,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.040073418097558065
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.02559819368665225,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.02559819368665225
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.227330779054917,
"acc_stderr": 0.014987270640946015,
"acc_norm": 0.227330779054917,
"acc_norm_stderr": 0.014987270640946015
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071138,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071138
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19292604501607716,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.19292604501607716,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543343,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543343
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279338,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.040693063197213754,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.040693063197213754
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014638,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014638
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789437,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789437
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.39825983953563676,
"mc2_stderr": 0.014916655527587098
}
}
```
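
For quick sanity checks you can aggregate the per-task entries yourself. Below is a minimal sketch of an unweighted mean over tasks (`macro_average` is a hypothetical helper; the leaderboard's own `all` block may be computed differently):

```python
def macro_average(results: dict, metric: str = "acc") -> float:
    """Unweighted mean of one metric across a dict of per-task results."""
    values = [task[metric] for task in results.values()]
    return sum(values) / len(values)

# Two per-task entries copied from the results above.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
}
print(round(macro_average(sample), 4))
# → 0.2248
```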
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_credit_gosdt_l512_d3_sd1 | 2023-08-26T01:02:11.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 1571238160
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_credit_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chargoddard__platypus2-22b-relora | 2023-09-12T22:50:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/platypus2-22b-relora
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/platypus2-22b-relora](https://huggingface.co/chargoddard/platypus2-22b-relora)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__platypus2-22b-relora\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T22:48:46.274282](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__platypus2-22b-relora/blob/main/results_2023-09-12T22-48-46.274282.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5540102068791638,\n\
\ \"acc_stderr\": 0.03433670784440371,\n \"acc_norm\": 0.5582488055284165,\n\
\ \"acc_norm_stderr\": 0.03431646480465005,\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.01609588415538685,\n \"mc2\": 0.43611604566165374,\n\
\ \"mc2_stderr\": 0.014524669084172097\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5358361774744027,\n \"acc_stderr\": 0.014573813664735718,\n\
\ \"acc_norm\": 0.5767918088737202,\n \"acc_norm_stderr\": 0.01443803622084804\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6153156741684923,\n\
\ \"acc_stderr\": 0.004855262903270804,\n \"acc_norm\": 0.8244373630750846,\n\
\ \"acc_norm_stderr\": 0.0037967010016923914\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992072,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.02721888977330877,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.02721888977330877\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954925,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954925\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115007,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7339449541284404,\n\
\ \"acc_stderr\": 0.0189460223222256,\n \"acc_norm\": 0.7339449541284404,\n\
\ \"acc_norm_stderr\": 0.0189460223222256\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n\
\ \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009164,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009164\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686933,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686933\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306393,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306393\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913044,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913044\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940975,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940975\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n\
\ \"acc_stderr\": 0.012612974369390979,\n \"acc_norm\": 0.4217731421121252,\n\
\ \"acc_norm_stderr\": 0.012612974369390979\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n\
\ \"mc1_stderr\": 0.01609588415538685,\n \"mc2\": 0.43611604566165374,\n\
\ \"mc2_stderr\": 0.014524669084172097\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/platypus2-22b-relora
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|arc:challenge|25_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hellaswag|10_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:19:46.876046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T22-48-46.274282.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T22-48-46.274282.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:19:46.876046.parquet'
- split: 2023_09_12T22_48_46.274282
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T22-48-46.274282.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T22-48-46.274282.parquet'
- config_name: results
data_files:
- split: 2023_08_26T01_19_46.876046
path:
- results_2023-08-26T01:19:46.876046.parquet
- split: 2023_09_12T22_48_46.274282
path:
- results_2023-09-12T22-48-46.274282.parquet
- split: latest
path:
- results_2023-09-12T22-48-46.274282.parquet
---
# Dataset Card for Evaluation run of chargoddard/platypus2-22b-relora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/platypus2-22b-relora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/platypus2-22b-relora](https://huggingface.co/chargoddard/platypus2-22b-relora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__platypus2-22b-relora",
"harness_truthfulqa_mc_0",
	split="latest")
```
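Once loaded, each configuration's metrics can be post-processed however you like. As a minimal sketch, the snippet below builds rough 95% confidence intervals (point estimate ± 1.96 × standard error) from the aggregated "all" metrics reported in the Latest results section; the values are copied from that section, and the interval formula is a standard normal approximation, not something this dataset prescribes:

```python
# Aggregated metrics and standard errors copied from the "all" section
# of the latest results (2023-09-12 run).
all_metrics = {
    "acc": 0.5540102068791638,
    "acc_norm": 0.5582488055284165,
    "mc1": 0.30354957160342716,
    "mc2": 0.43611604566165374,
}
stderrs = {
    "acc": 0.03433670784440371,
    "acc_norm": 0.03431646480465005,
    "mc1": 0.01609588415538685,
    "mc2": 0.014524669084172097,
}

# Normal-approximation 95% interval: point estimate +/- 1.96 * stderr.
intervals = {
    name: (value - 1.96 * stderrs[name], value + 1.96 * stderrs[name])
    for name, value in all_metrics.items()
}

for name, (lo, hi) in intervals.items():
    print(f"{name}: [{lo:.3f}, {hi:.3f}]")
```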
## Latest results
These are the [latest results from run 2023-09-12T22:48:46.274282](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__platypus2-22b-relora/blob/main/results_2023-09-12T22-48-46.274282.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5540102068791638,
"acc_stderr": 0.03433670784440371,
"acc_norm": 0.5582488055284165,
"acc_norm_stderr": 0.03431646480465005,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.01609588415538685,
"mc2": 0.43611604566165374,
"mc2_stderr": 0.014524669084172097
},
"harness|arc:challenge|25": {
"acc": 0.5358361774744027,
"acc_stderr": 0.014573813664735718,
"acc_norm": 0.5767918088737202,
"acc_norm_stderr": 0.01443803622084804
},
"harness|hellaswag|10": {
"acc": 0.6153156741684923,
"acc_stderr": 0.004855262903270804,
"acc_norm": 0.8244373630750846,
"acc_norm_stderr": 0.0037967010016923914
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992072,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330877,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330877
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954925,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954925
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.0189460223222256,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.0189460223222256
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997867,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997867
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009164,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009164
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686933,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686933
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306393,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306393
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913044,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913044
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940975,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940975
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.012612974369390979,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.012612974369390979
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.01609588415538685,
"mc2": 0.43611604566165374,
"mc2_stderr": 0.014524669084172097
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16 | 2023-08-27T12:43:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/CodeLlama-34B-Instruct-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/CodeLlama-34B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T01:22:34.444520](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16/blob/main/results_2023-08-26T01%3A22%3A34.444520.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39529982814936127,\n\
\ \"acc_stderr\": 0.03498378261854782,\n \"acc_norm\": 0.39673912641820325,\n\
\ \"acc_norm_stderr\": 0.03499033569271049,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3796928327645051,\n \"acc_stderr\": 0.014182119866974876,\n\
\ \"acc_norm\": 0.40784982935153585,\n \"acc_norm_stderr\": 0.014361097288449708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2998406691894045,\n\
\ \"acc_stderr\": 0.004572515919210699,\n \"acc_norm\": 0.35660227046405096,\n\
\ \"acc_norm_stderr\": 0.00478016987333286\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4037735849056604,\n \"acc_stderr\": 0.030197611600197953,\n\
\ \"acc_norm\": 0.4037735849056604,\n \"acc_norm_stderr\": 0.030197611600197953\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.02369541500946309,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.02369541500946309\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.44193548387096776,\n \"acc_stderr\": 0.02825155790684974,\n \"\
acc_norm\": 0.44193548387096776,\n \"acc_norm_stderr\": 0.02825155790684974\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"\
acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5202020202020202,\n \"acc_stderr\": 0.03559443565563918,\n \"\
acc_norm\": 0.5202020202020202,\n \"acc_norm_stderr\": 0.03559443565563918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.533678756476684,\n \"acc_stderr\": 0.036002440698671784,\n\
\ \"acc_norm\": 0.533678756476684,\n \"acc_norm_stderr\": 0.036002440698671784\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n\
\ \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45321100917431195,\n \"acc_stderr\": 0.021343255165546037,\n \"\
acc_norm\": 0.45321100917431195,\n \"acc_norm_stderr\": 0.021343255165546037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605596,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4092827004219409,\n \"acc_stderr\": 0.032007041833595914,\n \
\ \"acc_norm\": 0.4092827004219409,\n \"acc_norm_stderr\": 0.032007041833595914\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.0331883328621728,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.0331883328621728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319773,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319773\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.6196581196581197,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.565772669220945,\n \"acc_stderr\": 0.017724589389677785,\n\
\ \"acc_norm\": 0.565772669220945,\n \"acc_norm_stderr\": 0.017724589389677785\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.02648339204209818,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.02648339204209818\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.20446927374301677,\n \"acc_stderr\": 0.013488813404711917,\n\
\ \"acc_norm\": 0.20446927374301677,\n \"acc_norm_stderr\": 0.013488813404711917\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.02818059632825929,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.02818059632825929\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5048231511254019,\n \"acc_stderr\": 0.028396770444111298,\n\
\ \"acc_norm\": 0.5048231511254019,\n \"acc_norm_stderr\": 0.028396770444111298\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4567901234567901,\n\
\ \"acc_stderr\": 0.02771666165019404,\n \"acc_norm\": 0.4567901234567901,\n\
\ \"acc_norm_stderr\": 0.02771666165019404\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n\
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27835723598435463,\n\
\ \"acc_stderr\": 0.011446990197380985,\n \"acc_norm\": 0.27835723598435463,\n\
\ \"acc_norm_stderr\": 0.011446990197380985\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.028501452860396563,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.028501452860396563\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3431372549019608,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48258706467661694,\n\
\ \"acc_stderr\": 0.03533389234739244,\n \"acc_norm\": 0.48258706467661694,\n\
\ \"acc_norm_stderr\": 0.03533389234739244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.4428923144531004,\n\
\ \"mc2_stderr\": 0.014810370517699043\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:22:34.444520.parquet'
- config_name: results
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- results_2023-08-26T01:22:34.444520.parquet
- split: latest
path:
- results_2023-08-26T01:22:34.444520.parquet
---
# Dataset Card for Evaluation run of TheBloke/CodeLlama-34B-Instruct-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/CodeLlama-34B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-26T01:22:34.444520](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16/blob/main/results_2023-08-26T01%3A22%3A34.444520.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.39529982814936127,
"acc_stderr": 0.03498378261854782,
"acc_norm": 0.39673912641820325,
"acc_norm_stderr": 0.03499033569271049,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
},
"harness|arc:challenge|25": {
"acc": 0.3796928327645051,
"acc_stderr": 0.014182119866974876,
"acc_norm": 0.40784982935153585,
"acc_norm_stderr": 0.014361097288449708
},
"harness|hellaswag|10": {
"acc": 0.2998406691894045,
"acc_stderr": 0.004572515919210699,
"acc_norm": 0.35660227046405096,
"acc_norm_stderr": 0.00478016987333286
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4037735849056604,
"acc_stderr": 0.030197611600197953,
"acc_norm": 0.4037735849056604,
"acc_norm_stderr": 0.030197611600197953
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.44193548387096776,
"acc_stderr": 0.02825155790684974,
"acc_norm": 0.44193548387096776,
"acc_norm_stderr": 0.02825155790684974
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5202020202020202,
"acc_stderr": 0.03559443565563918,
"acc_norm": 0.5202020202020202,
"acc_norm_stderr": 0.03559443565563918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.533678756476684,
"acc_stderr": 0.036002440698671784,
"acc_norm": 0.533678756476684,
"acc_norm_stderr": 0.036002440698671784
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45321100917431195,
"acc_stderr": 0.021343255165546037,
"acc_norm": 0.45321100917431195,
"acc_norm_stderr": 0.021343255165546037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605596,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4092827004219409,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.4092827004219409,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.0331883328621728,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.0331883328621728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319773,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319773
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.565772669220945,
"acc_stderr": 0.017724589389677785,
"acc_norm": 0.565772669220945,
"acc_norm_stderr": 0.017724589389677785
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20446927374301677,
"acc_stderr": 0.013488813404711917,
"acc_norm": 0.20446927374301677,
"acc_norm_stderr": 0.013488813404711917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5048231511254019,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.5048231511254019,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.02771666165019404,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.02771666165019404
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27835723598435463,
"acc_stderr": 0.011446990197380985,
"acc_norm": 0.27835723598435463,
"acc_norm_stderr": 0.011446990197380985
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.028501452860396563,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.028501452860396563
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48258706467661694,
"acc_stderr": 0.03533389234739244,
"acc_norm": 0.48258706467661694,
"acc_norm_stderr": 0.03533389234739244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.4428923144531004,
"mc2_stderr": 0.014810370517699043
}
}
```
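The per-task entries in a results file like the one above can be inspected without loading the full dataset. The snippet below is a minimal sketch that ranks a hand-copied subset of the accuracies shown; the subset of tasks is illustrative only:

```python
# Rank a few of the per-task accuracies from the results above, best first.
results = {
    "harness|hendrycksTest-world_religions|5": 0.6432748538011696,
    "harness|hendrycksTest-marketing|5": 0.6196581196581197,
    "harness|hendrycksTest-college_physics|5": 0.22549019607843138,
    "harness|hendrycksTest-moral_scenarios|5": 0.20446927374301677,
}

# Sort task names by accuracy, highest first.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)

for task, acc in ranked:
    # Keep only the subject name (drop the harness prefix and few-shot setting).
    name = task.split("|")[1].replace("hendrycksTest-", "")
    print(f"{name}: {acc:.3f}")
```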
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_pol_gosdt_l512_d3_sd2 | 2023-08-26T01:42:14.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 13320800000
num_examples: 100000
- name: validation
num_bytes: 1332080000
num_examples: 10000
download_size: 958172306
dataset_size: 14652880000
---
# Dataset Card for "autotree_automl_pol_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_pol_gosdt_l512_d3_sd3 | 2023-08-26T01:52:24.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 13320800000
num_examples: 100000
- name: validation
num_bytes: 1332080000
num_examples: 10000
download_size: 959817651
dataset_size: 14652880000
---
# Dataset Card for "autotree_automl_pol_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf | 2023-09-17T22:02:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/CodeLlama-34b-Python-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/CodeLlama-34b-Python-hf](https://huggingface.co/ehartford/CodeLlama-34b-Python-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T22:02:41.600326](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf/blob/main/results_2023-09-17T22-02-41.600326.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931190325,\n \"f1\": 0.0019200922818791944,\n\
\ \"f1_stderr\": 0.0004138356823487018,\n \"acc\": 0.3307024467245462,\n\
\ \"acc_stderr\": 0.006650084932921209\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190325,\n\
\ \"f1\": 0.0019200922818791944,\n \"f1_stderr\": 0.0004138356823487018\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6614048934490924,\n\
\ \"acc_stderr\": 0.013300169865842417\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/CodeLlama-34b-Python-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T22_02_41.600326
path:
- '**/details_harness|drop|3_2023-09-17T22-02-41.600326.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T22-02-41.600326.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T22_02_41.600326
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-02-41.600326.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T22-02-41.600326.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:57:15.339948.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:57:15.339948.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T22_02_41.600326
path:
- '**/details_harness|winogrande|5_2023-09-17T22-02-41.600326.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T22-02-41.600326.parquet'
- config_name: results
data_files:
- split: 2023_08_26T01_57_15.339948
path:
- results_2023-08-26T01:57:15.339948.parquet
- split: 2023_09_17T22_02_41.600326
path:
- results_2023-09-17T22-02-41.600326.parquet
- split: latest
path:
- results_2023-09-17T22-02-41.600326.parquet
---
# Dataset Card for Evaluation run of ehartford/CodeLlama-34b-Python-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/CodeLlama-34b-Python-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/CodeLlama-34b-Python-hf](https://huggingface.co/ehartford/CodeLlama-34b-Python-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T22:02:41.600326](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__CodeLlama-34b-Python-hf/blob/main/results_2023-09-17T22-02-41.600326.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190325,
"f1": 0.0019200922818791944,
"f1_stderr": 0.0004138356823487018,
"acc": 0.3307024467245462,
"acc_stderr": 0.006650084932921209
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190325,
"f1": 0.0019200922818791944,
"f1_stderr": 0.0004138356823487018
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6614048934490924,
"acc_stderr": 0.013300169865842417
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_automl_pol_gosdt_l512_d3_sd1 | 2023-08-26T02:14:59.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 13320800000
num_examples: 100000
- name: validation
num_bytes: 1332080000
num_examples: 10000
download_size: 960146994
dataset_size: 14652880000
---
# Dataset Card for "autotree_automl_pol_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ashwincv0112/Master_Course_Application_Email_avp | 2023-08-26T02:16:06.000Z | [
"region:us"
] | ashwincv0112 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Course_Name
dtype: string
- name: Course_Description
dtype: string
- name: Admission_Email
dtype: string
splits:
- name: train
num_bytes: 31004
num_examples: 10
download_size: 34509
dataset_size: 31004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Master_Course_Application_Email_avp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NarchAI1992/bigfile_Luxtury_walnut | 2023-08-26T02:23:45.000Z | [
"license:openrail",
"region:us"
] | NarchAI1992 | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Python-fp16 | 2023-08-27T12:43:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/CodeLlama-34B-Python-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/CodeLlama-34B-Python-fp16](https://huggingface.co/TheBloke/CodeLlama-34B-Python-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Python-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T02:33:13.745130](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Python-fp16/blob/main/results_2023-08-26T02%3A33%3A13.745130.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.32944678557923035,\n\
\ \"acc_stderr\": 0.0339038417486707,\n \"acc_norm\": 0.3306787490661423,\n\
\ \"acc_norm_stderr\": 0.03391015929574356,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024626,\n \"mc2\": 0.43567105267740514,\n\
\ \"mc2_stderr\": 0.014685884652076228\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3575085324232082,\n \"acc_stderr\": 0.01400549427591657,\n\
\ \"acc_norm\": 0.38139931740614336,\n \"acc_norm_stderr\": 0.014194389086685268\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29924317864967137,\n\
\ \"acc_stderr\": 0.004569906485090286,\n \"acc_norm\": 0.3480382393945429,\n\
\ \"acc_norm_stderr\": 0.004753746951620155\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.02964781353936523,\n\
\ \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.02964781353936523\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n\
\ \"acc_stderr\": 0.035676037996391685,\n \"acc_norm\": 0.3236994219653179,\n\
\ \"acc_norm_stderr\": 0.035676037996391685\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338005,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338005\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3580645161290323,\n \"acc_stderr\": 0.027273890594300642,\n \"\
acc_norm\": 0.3580645161290323,\n \"acc_norm_stderr\": 0.027273890594300642\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2019704433497537,\n \"acc_stderr\": 0.028247350122180284,\n \"\
acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.028247350122180284\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.398989898989899,\n \"acc_stderr\": 0.0348890161685273,\n \"acc_norm\"\
: 0.398989898989899,\n \"acc_norm_stderr\": 0.0348890161685273\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442206,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442206\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.02407869658063547,\n \
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.02407869658063547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2074074074074074,\n \"acc_stderr\": 0.024720713193952158,\n \
\ \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.024720713193952158\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150016,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150016\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3596330275229358,\n \"acc_stderr\": 0.020575234660123783,\n \"\
acc_norm\": 0.3596330275229358,\n \"acc_norm_stderr\": 0.020575234660123783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298804,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298804\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083292,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083292\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.39662447257383965,\n \"acc_stderr\": 0.03184399873811224,\n \
\ \"acc_norm\": 0.39662447257383965,\n \"acc_norm_stderr\": 0.03184399873811224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.32286995515695066,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.46601941747572817,\n \"acc_stderr\": 0.04939291447273481,\n\
\ \"acc_norm\": 0.46601941747572817,\n \"acc_norm_stderr\": 0.04939291447273481\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.44871794871794873,\n\
\ \"acc_stderr\": 0.0325833464938688,\n \"acc_norm\": 0.44871794871794873,\n\
\ \"acc_norm_stderr\": 0.0325833464938688\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4508301404853129,\n\
\ \"acc_stderr\": 0.01779329757269904,\n \"acc_norm\": 0.4508301404853129,\n\
\ \"acc_norm_stderr\": 0.01779329757269904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.34104046242774566,\n \"acc_stderr\": 0.025522474632121615,\n\
\ \"acc_norm\": 0.34104046242774566,\n \"acc_norm_stderr\": 0.025522474632121615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761964,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761964\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.027826109307283686,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.027826109307283686\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43086816720257237,\n\
\ \"acc_stderr\": 0.028125340983972718,\n \"acc_norm\": 0.43086816720257237,\n\
\ \"acc_norm_stderr\": 0.028125340983972718\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621344,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621344\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26401564537157757,\n\
\ \"acc_stderr\": 0.011258435537723818,\n \"acc_norm\": 0.26401564537157757,\n\
\ \"acc_norm_stderr\": 0.011258435537723818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933102,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2875816993464052,\n \"acc_stderr\": 0.018311653053648222,\n \
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.018311653053648222\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.030932858792789855,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.030932858792789855\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.03610805018031023,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.03610805018031023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.0381107966983353,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.0381107966983353\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024626,\n \"mc2\": 0.43567105267740514,\n\
\ \"mc2_stderr\": 0.014685884652076228\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/CodeLlama-34B-Python-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:33:13.745130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:33:13.745130.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:33:13.745130.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:33:13.745130.parquet'
- config_name: results
data_files:
- split: 2023_08_26T02_33_13.745130
path:
- results_2023-08-26T02:33:13.745130.parquet
- split: latest
path:
- results_2023-08-26T02:33:13.745130.parquet
---
# Dataset Card for Evaluation run of TheBloke/CodeLlama-34B-Python-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/CodeLlama-34B-Python-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/CodeLlama-34B-Python-fp16](https://huggingface.co/TheBloke/CodeLlama-34B-Python-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Python-fp16",
"harness_truthfulqa_mc_0",
	split="latest")
```
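The split names visible in the configuration list above are derived from the run timestamp. A minimal sketch of that mapping, assuming the convention shown in the YAML front matter (where `-` and `:` in the timestamp become `_`):

```python
# Derive a split name from a run timestamp, following the naming convention
# visible in the YAML front matter above (":" and "-" are replaced with "_").
timestamp = "2023-08-26T02:33:13.745130"
split_name = timestamp.replace("-", "_").replace(":", "_")
# split_name now matches the timestamped split listed in the configurations
```

Passing either this timestamped split name or `"latest"` as the `split` argument of `load_dataset` should select the corresponding results.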
## Latest results
These are the [latest results from run 2023-08-26T02:33:13.745130](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Python-fp16/blob/main/results_2023-08-26T02%3A33%3A13.745130.json) (note that there might be results for other tasks in the repository if successive evaluations didn't cover the same tasks; you can find the results for each task in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.32944678557923035,
"acc_stderr": 0.0339038417486707,
"acc_norm": 0.3306787490661423,
"acc_norm_stderr": 0.03391015929574356,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024626,
"mc2": 0.43567105267740514,
"mc2_stderr": 0.014685884652076228
},
"harness|arc:challenge|25": {
"acc": 0.3575085324232082,
"acc_stderr": 0.01400549427591657,
"acc_norm": 0.38139931740614336,
"acc_norm_stderr": 0.014194389086685268
},
"harness|hellaswag|10": {
"acc": 0.29924317864967137,
"acc_stderr": 0.004569906485090286,
"acc_norm": 0.3480382393945429,
"acc_norm_stderr": 0.004753746951620155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3660377358490566,
"acc_stderr": 0.02964781353936523,
"acc_norm": 0.3660377358490566,
"acc_norm_stderr": 0.02964781353936523
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391685,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391685
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.03047297336338005,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.03047297336338005
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3580645161290323,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.3580645161290323,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180284,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180284
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.398989898989899,
"acc_stderr": 0.0348890161685273,
"acc_norm": 0.398989898989899,
"acc_norm_stderr": 0.0348890161685273
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442206,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442206
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.024720713193952158,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.024720713193952158
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3596330275229358,
"acc_stderr": 0.020575234660123783,
"acc_norm": 0.3596330275229358,
"acc_norm_stderr": 0.020575234660123783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298804,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298804
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083292,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083292
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.39662447257383965,
"acc_stderr": 0.03184399873811224,
"acc_norm": 0.39662447257383965,
"acc_norm_stderr": 0.03184399873811224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.46601941747572817,
"acc_stderr": 0.04939291447273481,
"acc_norm": 0.46601941747572817,
"acc_norm_stderr": 0.04939291447273481
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.0325833464938688,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.0325833464938688
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4508301404853129,
"acc_stderr": 0.01779329757269904,
"acc_norm": 0.4508301404853129,
"acc_norm_stderr": 0.01779329757269904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.025522474632121615,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.025522474632121615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761964,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761964
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.027826109307283686,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.027826109307283686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.43086816720257237,
"acc_stderr": 0.028125340983972718,
"acc_norm": 0.43086816720257237,
"acc_norm_stderr": 0.028125340983972718
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26401564537157757,
"acc_stderr": 0.011258435537723818,
"acc_norm": 0.26401564537157757,
"acc_norm_stderr": 0.011258435537723818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933102,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.018311653053648222,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.018311653053648222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031023,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.0381107966983353,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.0381107966983353
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024626,
"mc2": 0.43567105267740514,
"mc2_stderr": 0.014685884652076228
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b | 2023-08-27T12:43:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/Samantha-1.11-CodeLlama-34b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-CodeLlama-34b](https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-26T02:57:56.123943](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b/blob/main/results_2023-08-26T02%3A57%3A56.123943.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5358820026141583,\n\
\ \"acc_stderr\": 0.03500868644070597,\n \"acc_norm\": 0.539338751694411,\n\
\ \"acc_norm_stderr\": 0.03499653123588354,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.5046063820185052,\n\
\ \"mc2_stderr\": 0.015439454449885362\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.01456431885692485,\n\
\ \"acc_norm\": 0.5656996587030717,\n \"acc_norm_stderr\": 0.014484703048857355\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5763792073292173,\n\
\ \"acc_stderr\": 0.004931219148182242,\n \"acc_norm\": 0.7547301334395539,\n\
\ \"acc_norm_stderr\": 0.0042936778717263336\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464244,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464244\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.030767394707808093,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.030767394707808093\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5903225806451613,\n\
\ \"acc_stderr\": 0.027976054915347357,\n \"acc_norm\": 0.5903225806451613,\n\
\ \"acc_norm_stderr\": 0.027976054915347357\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244442,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244442\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45384615384615384,\n \"acc_stderr\": 0.025242770987126177,\n\
\ \"acc_norm\": 0.45384615384615384,\n \"acc_norm_stderr\": 0.025242770987126177\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.671559633027523,\n \"acc_stderr\": 0.02013590279729841,\n \"acc_norm\"\
: 0.671559633027523,\n \"acc_norm_stderr\": 0.02013590279729841\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n\
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n\
\ \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n\
\ \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009168,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009168\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7088122605363985,\n\
\ \"acc_stderr\": 0.016246087069701393,\n \"acc_norm\": 0.7088122605363985,\n\
\ \"acc_norm_stderr\": 0.016246087069701393\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.02658923114217426,\n\
\ \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.02658923114217426\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36983240223463687,\n\
\ \"acc_stderr\": 0.01614588125605622,\n \"acc_norm\": 0.36983240223463687,\n\
\ \"acc_norm_stderr\": 0.01614588125605622\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089768,\n\
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089768\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925657,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925657\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347237,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347237\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37222946544980445,\n\
\ \"acc_stderr\": 0.012346241297204368,\n \"acc_norm\": 0.37222946544980445,\n\
\ \"acc_norm_stderr\": 0.012346241297204368\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003483,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003483\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186457,\n \
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186457\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.033333333333333326,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.033333333333333326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150535,\n \"mc2\": 0.5046063820185052,\n\
\ \"mc2_stderr\": 0.015439454449885362\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:57:56.123943.parquet'
- config_name: results
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- results_2023-08-26T02:57:56.123943.parquet
- split: latest
path:
- results_2023-08-26T02:57:56.123943.parquet
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-CodeLlama-34b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-CodeLlama-34b](https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b",
"harness_truthfulqa_mc_0",
split="train")
```
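The configuration names used above follow a fixed pattern derived from the harness task name (separators such as `:` and `-` become underscores, and the shot count is appended). This can be sketched as a small helper; note that `config_name_for` is illustrative only and not part of the `datasets` API:

```python
def config_name_for(task: str, n_shot: int) -> str:
    """Build the config name this repo uses for a given harness task.

    For example, "hendrycksTest-world_religions" evaluated with 5 shots
    maps to "harness_hendrycksTest_world_religions_5".
    """
    # Non-alphanumeric separators (":" and "-") are replaced with underscores.
    sanitized = task.replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{n_shot}"

print(config_name_for("hendrycksTest-world_religions", 5))
print(config_name_for("truthfulqa:mc", 0))
```

Any of the config names listed in the YAML header can be reproduced this way and passed as the second argument to `load_dataset`.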
## Latest results
These are the [latest results from run 2023-08-26T02:57:56.123943](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b/blob/main/results_2023-08-26T02%3A57%3A56.123943.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5358820026141583,
"acc_stderr": 0.03500868644070597,
"acc_norm": 0.539338751694411,
"acc_norm_stderr": 0.03499653123588354,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.5046063820185052,
"mc2_stderr": 0.015439454449885362
},
"harness|arc:challenge|25": {
"acc": 0.5401023890784983,
"acc_stderr": 0.01456431885692485,
"acc_norm": 0.5656996587030717,
"acc_norm_stderr": 0.014484703048857355
},
"harness|hellaswag|10": {
"acc": 0.5763792073292173,
"acc_stderr": 0.004931219148182242,
"acc_norm": 0.7547301334395539,
"acc_norm_stderr": 0.0042936778717263336
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464244,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464244
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.030767394707808093,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.030767394707808093
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347357,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347357
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244442,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244442
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45384615384615384,
"acc_stderr": 0.025242770987126177,
"acc_norm": 0.45384615384615384,
"acc_norm_stderr": 0.025242770987126177
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.671559633027523,
"acc_stderr": 0.02013590279729841,
"acc_norm": 0.671559633027523,
"acc_norm_stderr": 0.02013590279729841
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5829596412556054,
"acc_stderr": 0.03309266936071721,
"acc_norm": 0.5829596412556054,
"acc_norm_stderr": 0.03309266936071721
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009168,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009168
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7088122605363985,
"acc_stderr": 0.016246087069701393,
"acc_norm": 0.7088122605363985,
"acc_norm_stderr": 0.016246087069701393
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.02658923114217426,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.02658923114217426
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36983240223463687,
"acc_stderr": 0.01614588125605622,
"acc_norm": 0.36983240223463687,
"acc_norm_stderr": 0.01614588125605622
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089768,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925657,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925657
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347237,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347237
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37222946544980445,
"acc_stderr": 0.012346241297204368,
"acc_norm": 0.37222946544980445,
"acc_norm_stderr": 0.012346241297204368
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.020217030653186457,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.020217030653186457
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033333333333333326,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033333333333333326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150535,
"mc2": 0.5046063820185052,
"mc2_stderr": 0.015439454449885362
}
}
```
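The per-task block above is a plain nested dict once loaded, so it can be post-processed directly. As a minimal sketch (using an abbreviated three-task excerpt of the results shown above, not the full dict), this ranks MMLU subjects by accuracy:

```python
# Abbreviated excerpt of the results dict above; the real dict has one
# entry per evaluated task, keyed "harness|<task>|<n_shot>".
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.7863247863247863},
    "harness|hendrycksTest-global_facts|5": {"acc": 0.29},
    "harness|hendrycksTest-management|5": {"acc": 0.7378640776699029},
}

# Keep only MMLU (hendrycksTest) tasks and sort by accuracy, best first.
mmlu = {
    name.split("|")[1].removeprefix("hendrycksTest-"): metrics["acc"]
    for name, metrics in results.items()
    if "hendrycksTest" in name
}
ranking = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
print(ranking[0][0])   # strongest subject in this excerpt
print(ranking[-1][0])  # weakest subject in this excerpt
```

The same pattern works on the full results file downloaded from the link above (`removeprefix` requires Python 3.9+).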
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
terrytengli/pokemon-rand-redrect | 2023-08-26T03:35:41.000Z | [
"region:us"
] | terrytengli | null | null | null | 0 | 0 | Entry not found |
terrytengli/pokemon-original | 2023-08-26T03:36:09.000Z | [
"region:us"
] | terrytengli | null | null | null | 0 | 0 | Entry not found |
Nater-EX99/Sylphynford_Tachibana_Unprocessed | 2023-08-26T04:22:22.000Z | [
"license:other",
"region:us"
] | Nater-EX99 | null | null | null | 0 | 0 | ---
license: other
---
|
quoctrungle/hcmus_QA_train | 2023-08-26T04:15:32.000Z | [
"region:us"
] | quoctrungle | null | null | null | 0 | 0 | Entry not found |